[R] R and large RAM

Thomas Lumley thomas at biostat.washington.edu
Fri Jul 7 18:06:59 CEST 2000

On Fri, 7 Jul 2000, Joerg Kindermann wrote:

> >>>>> Dipl.-Stat. Detlef Steuer <steuer at statistik.uni-dortmund.de> writes:
>     > Hi!  On 07-Jul-2000 Gerhard Paass wrote:
>     >> I am planning to buy several machines with 2 GB RAM.  Is R able to
>     >> use this much memory?
>     > Yes! We are happily running R on two Enterprise Servers with 2 GB
>     > RAM.  R was behaving well when asked to consume a lot of this RAM to
>     > work on large datasets.
> This does not mean that R actually uses that much space. From experimenting
> with the nsize and vsize command line parameters we know that the current
> (R-1.1.0) limits are somewhere around nsize=19.5M and vsize=990M. On a Sun
> running Solaris 2.7 this will require about 1400M of memory.
> You can't request larger sizes of memory for R-internal reasons. Our
> question is whether this is likely to be changed in the near future.

I can get vsize up to nearly 2Gb under Solaris 2.8
> gc()
            free     total (Mb)
Ncells  19777754  19922944  380
Vcells 248921011 249036800 1900

The 2Gb limit on vsize exists because the size variables involved are 32 bits wide.

The nsize limit seems wrong to me.  It is coded in src/unix/sys-common.c
as 20000000 (i.e. 2x10^7), but even at that limit each node could take up
100 bytes and still fit below the 2^32-1 byte limit. Surely a larger value
is possible.

In principle it should be possible to extend both these sizes to 2^64-1
bytes on machines with 64-bit longs. I don't know how many places have
hard-coded 32-bit assumptions.  However, I don't think that anyone on R-core
routinely uses a 64-bit compiler, which might be the biggest barrier.


r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
