[R] cox memory
David Winsemius
dwinsemius at comcast.net
Tue Sep 22 14:04:20 CEST 2009
On Sep 21, 2009, at 7:27 PM, Λεωνίδας Μπαντής wrote:
>
> Hi there,
>
> I have a rather large data set and fit the following Cox model:
>
>
> library(survival)
> test1 <- list(tstart, tstop, death1, chemo1, radio1, horm1)
> out1 <- coxph(Surv(tstart, tstop, death1) ~
>                   chemo1 + chemo1:log(tstop + 1) +
>                   horm1 + horm1:log(tstop + 1) +
>                   age1 +
>                   grade1 + grade1:log(tstop + 1) +
>                   positive1 + positive1:log(tstop + 1) +
>                   size1 + size1:log(tstop + 1),
>               data = test1)
> out1
>
> Up to here everything works fine (each covariate has a length of
> 289205).
> Now I want to see a specific profile of the above model so I ask for:
>
> x11()
> profilbig2 <- survfit(out1,
>     newdata = data.frame(chemo1    = rep(0, length(chemo1)),
>                          horm1     = rep(0, length(chemo1)),
>                          age1      = rep(mean(age1), length(chemo1)),
>                          grade1    = rep(0, length(chemo1)),
>                          positive1 = rep(1, length(chemo1)),
>                          size1     = rep(mean(size1), length(chemo1))))
> plot(profilbig2, col = "blue")
I am a bit puzzled here. I do not see any variation within the
newdata object: every row is identical. If my wetware R interpreter is
working, I wonder whether you couldn't just use:
newdata = data.frame(chemo1    = 0,
                     horm1     = 0,
                     age1      = mean(age1),
                     grade1    = 0,
                     positive1 = 1,
                     size1     = mean(size1))
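Something like this (a minimal sketch, assuming the fitted out1 and
the covariate vectors from your session; the name nd1 is mine):

## one hypothetical subject instead of 289205 identical copies
nd1 <- data.frame(chemo1    = 0,
                  horm1     = 0,
                  age1      = mean(age1),
                  grade1    = 0,
                  positive1 = 1,
                  size1     = mean(size1))
profilsmall <- survfit(out1, newdata = nd1)
plot(profilsmall, col = "blue")

(Since your model has log(tstop + 1) interaction terms, survfit may
also ask for tstop in newdata; if so, supply a representative value.)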
>
> and I get the following error:
>
> Error: cannot allocate vector of size 1.5 Gb
> In addition: Warning messages:
> 1: In vector("double", length) :
> Reached total allocation of 1535Mb: see help(memory.size)
> 2: In vector("double", length) :
> Reached total allocation of 1535Mb: see help(memory.size)
> 3: In vector("double", length) :
> Reached total allocation of 1535Mb: see help(memory.size)
> 4: In vector("double", length) :
> Reached total allocation of 1535Mb: see help(memory.size)
>
>
> I am wondering why this is happening, since I did manage to fit the
> model. Shouldn't the memory problem have popped up earlier, while I
> was fitting the model? So I can fit the model but still cannot study
> a certain profile? Can anyone suggest something? I am not an advanced
> user of R; am I typing something wrong, or can I do something more
> clever to see the profile of a hypothetical subject?
I don't claim to be an advanced user, but I have had the same
question. My tentative answer is that the coxph-created object does
not save the baseline survival estimate, so survfit has to recreate
it. I also wonder whether you need a newdata object that is quite so
long.
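A rough way to see where the memory goes (a sketch, assuming out1 and
the one-row nd1 from above): coxph stores only the fit, while survfit
materializes one survival curve per row of newdata, with one entry per
event time.

print(object.size(out1), units = "Mb")   # the fit itself is modest
sf1 <- survfit(out1, newdata = nd1)      # one row -> one curve
print(object.size(sf1), units = "Mb")
## with 289205 rows, the surv matrix alone would hold roughly
## (number of distinct event times) x 289205 doubles, which can
## easily reach the 1.5 Gb allocation reported in your error
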
>
> Thanks in advance for any answers.
> (I have 2 GB of RAM.)
>
> P.S. I noticed "--max-mem-size" in the help, but I am not quite sure
> how to use it.
That is going to depend on your OS, which you have not mentioned. If
it is Windows, see the R for Windows FAQ. If it is Linux or a Mac,
then the only answer (assuming you are not satisfied with a solution
that uses, say, 100 or 1000 points) is to buy more memory.
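
If you are on Windows, the flag goes on the command line when R
starts, or you can raise the ceiling from inside a session (a sketch
for 2009-era Windows R; the numbers are examples, not recommendations):

## at the Windows command prompt:
##   Rgui.exe --max-mem-size=2047M
## or from within a running session (Windows only):
memory.limit()              # current ceiling in Mb
memory.limit(size = 2047)   # raise it, up to what the OS allows

Neither helps on Linux or a Mac, where R simply uses whatever memory
the OS gives it.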
--
David Winsemius, MD
Heritage Laboratories
West Hartford, CT