[R] Memory failure!!!!

Prof Brian Ripley ripley at stats.ox.ac.uk
Mon Aug 9 13:19:31 CEST 2004


On Mon, 9 Aug 2004, Monica Palaseanu-Lovejoy wrote:

> I am trying to increase the memory R can use. I am running R 
> under Windows on a machine with 2 GB of physical RAM and 4 GB 
> of paged memory.
> 
> I put --sdi --max-mem-size=4094M in the R Properties window, 
> but when R is doing Bayesian modelling (geoR) it stops at about 
> 1,096K and I get memory errors because it cannot allocate a new 
> segment of about 500K of memory.

Please read the rw-FAQ.  Windows cannot allocate 4Gb to a user process, 
and unless you have followed the instructions there you are limited to 
2Gb (and probably a bit less).
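
For example, with the default limits the shortcut target would be
something like the line below (2047M is only an illustrative value under
the 2Gb ceiling, not a recommendation), and you can check what R actually
accepted from inside the session:

  Rgui.exe --sdi --max-mem-size=2047M

  ## inside R (these functions are specific to R for Windows;
  ## see their help pages for the units they report):
  memory.limit(size = NA)   # the limit R is currently running under
  memory.size(max = TRUE)   # the most memory R has obtained from the OS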

Also please read the posting guide and report the exact message you got,
as it requests.  Unless the message says something about memory limits,
the failure is because the memory could not be obtained from the OS.
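
If the message has scrolled out of view, the text of the last error can
be recovered inside the session, e.g.

  geterrmessage()   # returns the last error message, typically either a
                    # "cannot allocate vector of size ..." message or one
                    # that mentions the total allocation limit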

> I don't have Visual Basic so I cannot use the other commands 
> suggested in Help.

What has Visual Basic to do with this?

> Also, if I am using memory.size(max=TRUE) I get a value 
> corresponding to about 1024K, and if I am using 
> memory.limit(size=NA) I get a value of about 4000K.
> 
> How can I force R to use more memory?

By making it available to R.  Try rebooting your machine before running R, 
to reduce memory fragmentation.
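
A small in-session complement to that, before starting the large fit
(the object name here is only a hypothetical example):

  rm(big.intermediate.object)   # drop large objects no longer needed
  gc()                          # collect garbage and report memory use
  memory.size(max = TRUE)       # how much R has obtained from the OS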

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



