[R] Testing memory limits in R??
Peter Dalgaard
p.dalgaard at biostat.ku.dk
Tue Jul 7 08:05:57 CEST 2009
Duncan Murdoch wrote:
> On 06/07/2009 4:16 PM, Peter Dalgaard wrote:
>> Scott Zentz wrote:
>>> Hello Everyone,
>>>
>>> We have recently purchased a server which has 64GB of memory
>>> running a 64bit OS and I have compiled R from source with the
>>> following config
>>>
>>> ./configure --prefix=/usr/local/R-2.9.1 --enable-R-shlib
>>> --enable-BLAS-shlib --enable-shared --with-readline --with-iconv
>>> --with-x --with-tcltk --with-aqua --with-libpng --with-jpeglib
>>>
>>> and I would like to verify that R can use 55-60GB of the 64GB of
>>> memory. Does anyone know how to test this? Will R be able to access
>>> that amount of memory from a single process? I am not an R user
>>> myself, but I wanted to check this before turning the server over to
>>> the researchers.
>>
>> Hmm, it's slightly tricky because R often duplicates objects, so you
>> may hit the limit only transiently. Also, R has an internal 2GB limit
>> on single vectors. But something like this
>
> Is it a 2 GB limit in size, or in the number of elements? I'm still
> spending almost all my time in 32-bit land, so it's hard to check.
It's in length: at most 2^31 - 1 elements per vector, whatever their
size in bytes. I was getting a couple of wires crossed there.
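
Something like this should make the distinction visible (an untested
sketch, assuming a 64-bit build of a 2.x-series R such as the 2.9.1
above; it briefly allocates a couple of GB):

.Machine$integer.max     # 2147483647, i.e. 2^31 - 1, the maximum length
try(numeric(2^31))       # one element too many: this should fail on 2.x
x <- numeric(2^28)       # 2^28 doubles, a bit over 2GB with the header
print(object.size(x))    # a byte count beyond 2GB is fine on a 64-bit build
rm(x); gc()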
-p
> Duncan Murdoch
>
>>
>> Y <- replicate(30, rnorm(2^28 - 1), simplify = FALSE)
>>
>> should create a list of 30 vectors of just under 2GB each, about
>> 60GB in all. Then lapply(Y, mean) should generate 30 very good and
>> very expensive approximations to 0.
>>
>> (For obvious reasons, I haven't tested this on a 1GB ThinkPad X40....)
>>
>>
>
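If you would rather watch the memory use build up than allocate it all
in one go, here is an incremental variant of the replicate() example
quoted above (again only a sketch, and not tested at anything like this
size):

Y <- vector("list", 30)
for (i in seq_along(Y)) {
    Y[[i]] <- rnorm(2^28 - 1)       # just under 2GB of doubles per element
    print(gc())                     # Vcells "max used" should climb to ~60GB
}
sum(sapply(Y, object.size)) / 2^30  # total size of the list, roughly in GB
lapply(Y, mean)                     # as above, 30 expensive approximations to 0

If an allocation fails with "cannot allocate vector of size ...", it is
worth checking the shell's limits (e.g. ulimit -v) before concluding
that R itself is the problem.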
--
O__ ---- Peter Dalgaard Øster Farimagsgade 5, Entr.B
c/ /'_ --- Dept. of Biostatistics PO Box 2099, 1014 Cph. K
(*) \(*) -- University of Copenhagen Denmark Ph: (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk) FAX: (+45) 35327907