[R] running out of memory

Uwe Ligges ligges at statistik.uni-dortmund.de
Thu Feb 17 08:50:27 CET 2005


Stephen Choularton wrote:

> Hi
>  
> I am trying to do a large glm and am running into this message:
>  
> Error: cannot allocate vector of size 3725426 Kb
> In addition: Warning message: 
> Reached total allocation of 494Mb: see help(memory.size)
>  
> Am I simply out of memory (I only have 0.5 GB)?
>  
> Is there something I can do?

You have to rethink whether the analysis you are doing is sensible this 
way, or whether you can respecify things. R claims to need almost 4 GB(!) 
for the next single memory allocation, so you will get into trouble even 
on huge machines....
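
For example, a minimal sketch of two things to try (here 'dat', 'y', 'x1' 
and 'x2' are placeholder names, not taken from your post):

  ## On Windows, check how close you are to the limit mentioned in the
  ## warning, and raise it if the machine actually has more RAM free;
  ## see help(memory.size).
  memory.size()               # Mb currently used by R
  memory.limit()              # current allocation limit in Mb
  ## memory.limit(size = 1024)   # raise the limit (Windows only)

  ## Or fit the model on a random subset of rows first, to check that
  ## the specification itself is sensible before fighting the full data:
  set.seed(1)
  sub <- dat[sample(nrow(dat), 10000), ]
  fit <- glm(y ~ x1 + x2, data = sub)
  summary(fit)

Neither of these will help if the model really requires a single nearly 
4 GB allocation; in that case the specification itself (e.g. a factor 
with a very large number of levels) is usually what needs rethinking.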

  Uwe Ligges


> Stephen
> 
