[R] size limit in R?

Roland Rau roland.rproject at gmail.com
Mon May 21 19:57:46 CEST 2007


j-simpson at northwestern.edu wrote:
 > Hi,
 >
 > Please see the email exchanges below.  I am having trouble generating
 > output that is large enough for our needs, specifically when using the
 > GaussRF function.  However, when I wrote Dr. Schlather (the author of
 > the GaussRF function), he indicated that there is also a limit imposed
 > by R itself.  Is this something that we can overcome?
 >

I could be wrong, but I think you did not provide any information on 
your platform. Assuming it is Win32, this is a FAQ; please see:
http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021
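
As a side note (this is from memory, so please check the FAQ entry above): 
on the Windows builds of R of that era, the memory ceiling could be queried 
and, within what the OS allows, raised with memory.limit() — a 
Windows-only function, since removed in R 4.2.0:

 > memory.limit()            # current limit in Mb
 > memory.limit(size = 3000) # request a higher limit, in Mb

On a 32-bit Windows system this still cannot go beyond the 2-3 GB that 
the OS makes available to a single user process.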


 >
 >> x <- numeric( 200 / 0.025 * 1450 / 0.025)
 > Error: cannot allocate vector of size 3625000 Kb
 >

You will hit memory limits rather quickly if you want to allocate more 
than one of your 3.6GB vectors - and this is neither the fault of R nor 
of Win32.
Although I don't have a background in Computer Science, I think the 
physical limit for addressable memory on a 32bit system is 4GB.
 > 2^32/(1024*1024*1024)
[1] 4
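
Just as a back-of-the-envelope check of the allocation in your error 
message (assuming the usual 8 bytes per element of a numeric vector):

 > n <- 200 / 0.025 * 1450 / 0.025  # 464,000,000 elements
 > n * 8 / 1024                     # size in Kb
[1] 3625000
 > n * 8 / 1024^3                   # size in GiB
[1] 3.457069

So a single such vector already needs about 3.5GB, which is why even one 
of them is hopeless within a 4GB (in practice, less) address space.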


I hope this helps?
Roland

(And I hope I did not claim anything wrong about 32bit systems)


