[R] probs in memory

j_turner at ihug.co.nz
Wed May 15 01:38:05 CEST 2002


> glm.i01<-glm(U~y+v+i+b+v:y+v:i+v:b+y:i+y:b+i:b,family=Gamma(link="log"))
> Error: cannot allocate vector of size 153594 Kb

R grabs memory as it needs it.  Sometimes it asks for a little, sometimes a lot.
The words "153594 Kb" mean that just now, R tried to grab 153594 Kb.  Not total
memory; just this most recent grab.  And it couldn't, because your machine
didn't have that much memory to give.
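
A quick sanity check on what a grab that size means, assuming R's usual 8
bytes per numeric element (what the block was for is my guess; the error
doesn't say):

    153594 * 1024 / 8   # Kb -> bytes -> doubles
    # 19660032, i.e. one contiguous block of about 19.7 million doubles

With all those interactions, the model matrix alone is that order of size,
and fitting typically needs several working copies of that order at once.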

Run that session again, and run "top" in another xterm window as it goes.
Order the display by memory size - when R starts hitting swap, it'll slow
down a lot.  You should see the memory get gobbled up as the fit proceeds -
it's quite interesting to watch.
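
If you'd rather watch from inside R itself, gc() is standard; it runs a
collection and reports how much memory live objects are holding (the exact
column labels vary a little between versions):

    gc()   # the "used" columns show memory held by live objects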

> I thought this wasn't supposed to happen in version 1.5?!

The people who write R are very, very smart, but they still aren't smart enough 
to write software that successfully uses more memory than your machine has.

> How can I solve this problem ?? 

1) Pay more attention to your workspace, and see if you have large objects
lying around that you don't need (there's a sketch after this list), or
2) Buy more RAM, or
3) Use a subset of the data, see if it can finish and give meaningful
results, and repeat for other subsets, or
4) Write your own analysis in C or FORTRAN that's more memory efficient. ;)
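
For 1) and 3), a rough sketch - "mydata" and the object names below are
made-up placeholders, not anything from your session:

    # List workspace objects by approximate size, biggest first:
    sizes <- sapply(ls(), function(nm) object.size(get(nm)))
    rev(sort(sizes))

    rm(some.big.object)   # drop what you don't need (placeholder name)
    gc()                  # and hand the memory back

    # Fit the same model on a random subset of the rows:
    idx <- sample(nrow(mydata), 10000)
    glm.sub <- glm(U ~ y + v + i + b + v:y + v:i + v:b + y:i + y:b + i:b,
                   family = Gamma(link = "log"), data = mydata[idx, ])

If the subset fits run, and the coefficients look stable across subsets,
you at least know the model itself is sane before you go hunting for RAM.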

Cheers

Jason
