[R] R beyond 2 Gb

ripley at stats.ox.ac.uk
Sat Jun 1 09:26:51 CEST 2002


On Fri, 31 May 2002, Felix Hernandez Campos wrote:

> I'm working with a fairly massive data set and I need to allocate memory
> beyond the 2GB limit. I'm using R 1.5.0 on a SunOS 5.8 machine (16 GB RAM).
> Has anyone been successful at patching the code to get rid of the 2GB memory
> limit? (there are quite a few places in which vsize is an int or tested
> against INT_MAX). I'll appreciate any guidance you can provide. Thanks.

What 2GB limit?  The vsize in memory.c is in units of VECREC (which on
almost all platforms is 8 bytes), so those comparisons `limit' you to
about 16GB, not 2GB.  AFAIK only the code that sets the limits from the
command line is restricted to 2048M *on a 32-bit platform*, but R can be
compiled on Solaris to use 64-bit pointers and longs (see the R-admin
manual), or even to use 64-bit integers.
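As a quick sanity check on where the 16GB figure comes from (this is
just the arithmetic, done in R itself; nothing here is taken from
memory.c):

    ## vsize is counted in vecrecs of (usually) 8 bytes each, so a
    ## comparison against INT_MAX bounds the vector heap at roughly
    .Machine$integer.max * 8 / 2^30   # ~16 gigabytes, not 2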

That's not to say there are no other, unintended limits, but my
understanding is that R has been used beyond 4GB on Solaris.
I've always found it simpler to re-write my code to use less memory, e.g.
to process data in chunks.
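
A minimal sketch of that chunked style, assuming a hypothetical
whitespace-delimited file big.dat with a single numeric column (the
file name and chunk size are illustrative, not from anything above):

    con <- file("big.dat", open = "r")
    total <- 0; n <- 0
    repeat {
        x <- scan(con, what = numeric(0), n = 100000, quiet = TRUE)
        if (length(x) == 0) break    # end of file reached
        total <- total + sum(x)      # accumulate a running sum
        n <- n + length(x)
    }
    close(con)
    total / n    # the mean, computed one chunk at a time

Memory use is then bounded by the chunk size rather than by the size
of the whole data set.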

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272860 (secr)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
