[R] question about running out of memory on R -- memory.limit change in R 2.6.0?

Earl F. Glynn efg at stowers-institute.org
Tue Nov 6 19:38:07 CET 2007


"jim holtman" <jholtman at gmail.com> wrote in message 
news:644e1f320711051542s51a7e793t8cbc1f8e7cc35b at mail.gmail.com...
> --max-mem-size=N
> (Windows only) Specify a limit for the amount of memory to be used
> both for R objects and working areas. This is set by default to the
> smaller of 1.5Gb and the amount of physical RAM in the machine, and
> must be between 32Mb and 3Gb.
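
Aside: from within a Windows session the limit can also be queried with
memory.limit(), and, if I'm reading the help page right, a larger limit
can be requested through its size argument (given in Mb).  A minimal
illustration, where 2500 is just an example value:

> memory.limit()             # report the current limit
> memory.limit(size = 2500)  # request a new limit of 2500 Mb (Windows only)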

Something seems to have changed in R 2.6.0 -- the limit is no longer 3 GB.

On a PC with 3.5 GB of RAM running R 2.5.1:

Default command line:

> memory.limit()
[1] 1610612736

With --max-mem-size=3000M on the command line:

> memory.limit()
[1] 3145728000

Let's see what happens with R 2.6.0 on the same 3.5 GB PC (Windows XP).

Default command line:

> memory.limit()
[1] 1535.875

Note:  the return value is now reported in Mb instead of bytes.

With --max-mem-size=3000M on the command line, R warns at startup:

WARNING: --max-mem-size=3000M: too large and taken as 2047M

> memory.limit()
[1] 2047.875

This value is less than the value shown in R 2.5.1 (after converting both to 
the same basis).
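
In Mb, the R 2.5.1 values work out as follows (dividing the byte counts by 1024^2):

> 1610612736 / 1024^2     # R 2.5.1 default limit, in Mb
[1] 1536
> 3145728000 / 1024^2     # R 2.5.1 limit with --max-mem-size=3000M, in Mb
[1] 3000

so the 2047.875 Mb that R 2.6.0 allows is well below the 3000 Mb that R 2.5.1 accepted.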



But was this change a "fix" to reflect that a 32-bit Windows process only has a 
2 GB user address space?  That is, was the value of 3145728000 returned by 
R 2.5.1 not strictly correct?
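
If that is the explanation, the arithmetic at least lines up with a 2 GB ceiling:

> 2047.875 * 1024^2       # the new cap, converted to bytes
[1] 2147352576
> 2 * 1024^3              # 2 GB, the default user address space of a 32-bit Windows process
[1] 2147483648

i.e. the new cap sits just under 2 GB.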



efg

Earl F. Glynn
Scientific Programmer
Stowers Institute


