[R] strange behaviour of memory management

Prof Brian Ripley ripley at stats.ox.ac.uk
Thu Oct 20 10:51:46 CEST 2005


On Thu, 20 Oct 2005, Meinhard Ploner wrote:

> Hi all!
> My system: R 2.1.1, Mac OS X 10.4.2.
>
> I have a very memory-consuming job for R, consisting of a function
> calling some other functions, often working with matrices of size
> 100,000 x 300. If I call the job directly after starting R, the job
> takes 40 minutes overall, but only 7 minutes of process time. I assume
> the large difference comes from memory handling, which doesn't count as
> process time. If I start the job after making some shorter runs and
> doing some programming, the job stops on reaching a memory limit. It
> seems that R doesn't release all the memory, even though the job adds
> no global objects.

What is the message?  Most often this happens not because memory is not 
available but because contiguous memory is not available.  You have only 
3GB (or less) of process address space, and that can become fragmented 
enough that no hole is left for objects of about 240MB each (assuming 
numeric matrices).
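As a rough check (my arithmetic, not part of the original exchange), one 
such matrix stored as doubles needs

    100000 * 300 * 8 / 1e6                # 8 bytes per double: 240 MB
    object.size(matrix(0, 100000, 300))   # allocates one such matrix and measures it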

> Further, I'm interested in whether, on a UNIX derivative like Mac OS X,
> gc() or rm(localObjects) (used in local functions) makes any difference
> or gives any advantage?

gc() unlikely (R has probably already tried that, but we haven't seen the 
message).  rm(localObjects): yes, it can help, even for small objects.
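
A minimal sketch of that pattern (the function and object names here are 
illustrative, not from the original job):

    f <- function(n) {
        big <- matrix(rnorm(n * 300), n, 300)  # large local object
        ans <- colSums(big)
        rm(big)   # drop the local binding as soon as it is no longer needed
        gc()      # collect now, so the space is free before the next big allocation
        ans
    }

Removing large locals before the next big allocation reduces the chance 
that the address space fragments in the way described above.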

If my guess is right, the real answer is a 64-bit OS.
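
One way to see which kind of build is running is to check its pointer size 
(a general R check, not specific to the original poster's setup):

    8 * .Machine$sizeof.pointer   # 32 for a 32-bit build, 64 for a 64-bit build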

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



