[R] Testing memory limits in R??
    Marc Schwartz
    marc_schwartz at me.com
    Tue Jul  7 04:05:48 CEST 2009

On Jul 6, 2009, at 8:39 PM, Duncan Murdoch wrote:
> On 06/07/2009 4:16 PM, Peter Dalgaard wrote:
>> Scott Zentz wrote:
>>> Hello Everyone,
>>>
>>> We have recently purchased a server which has 64GB of memory
>>> running a 64-bit OS, and I have compiled R from source with the
>>> following configure options:
>>>
>>> ./configure --prefix=/usr/local/R-2.9.1 --enable-R-shlib --enable-BLAS-shlib
>>> --enable-shared --with-readline --with-iconv --with-x --with-tcltk
>>> --with-aqua --with-libpng --with-jpeglib
>>>
>>> and I would like to verify that I can use 55GB-60GB of the 64GB of  
>>> memory within R. Does anyone know how this is possible? Will R be  
>>> able to access that amount of memory from a single process? I am  
>>> not an R user myself but I just wanted to test this before I  
>>> turned the server over to the researchers.
>> Hmm, it's slightly tricky because R often duplicates objects, so  
>> you may hit the limit only transiently. Also, R has an internal 2GB  
>> limit on single vectors. But something like this
>
> Is it a 2 GB limit in size, or in the number of elements?  I'm still  
> spending almost all my time in 32 bit land, so it's hard to check.
>
> Duncan Murdoch
I believe that Peter is referring to the vector length limit rather
than a RAM limit. The only figure that I have seen referenced over the
years is the limit on the length of a single vector, which arises
because R uses signed 32-bit integers for indexing:
# 2 Gb
> 2 * 1024^3
[1] 2147483648

# Figure referenced in ?"Memory-limits"
> 2^31 - 1
[1] 2147483647
Since matrices and arrays are vectors with 'dim' attributes, these  
objects have this limit as well.
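
Untested here, but as a rough sketch of how to verify that a single R
process can actually use most of that RAM (assuming 8 bytes per double
and no per-process limits imposed by the OS), one could allocate a few
vectors at or near the maximum length and watch memory usage:

# A single double vector at the maximum length needs about 16 Gb:
> (2^31 - 1) * 8 / 1024^3
[1] 16

# So three such vectors should push one R process towards ~48 Gb.
# Reduce 'n' if the allocations fail.
n <- 2^31 - 1            # maximum vector length
v1 <- numeric(n)         # ~16 Gb
v2 <- numeric(n)         # ~32 Gb total
v3 <- numeric(n)         # ~48 Gb total
gc()                     # 'Vcells' row reports memory used for numeric data
object.size(v1)          # bytes used by a single vector

Watching the process in top at the same time would confirm the resident
size on the OS side.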
HTH,
Marc
    
    