[Rd] Memory management problem?

Prof Brian Ripley ripley@stats.ox.ac.uk
Tue, 12 Jun 2001 19:35:13 +0100 (BST)


On Tue, 12 Jun 2001, Prof Brian D Ripley wrote:

> On Tue, 12 Jun 2001 apjaworski@mmm.com wrote:
>
> > I just ran into the following problem.  I am not sure what causes it, but
> > here is the scenario:
> >
> > (1) I start R with 512Mb of memory.  (I am on a Win2000 PC with 512Mb of
> > physical memory.)
> > (2) I put a couple of small functions in my workspace and I generate two
> > real vectors x and y of length 2^19 (524288) each.
> > (3) I save this workspace.  The size of the saved file is about 8.4Mb.
> > (4) I run lm to fit y vs. x using an 11-parameter model.  I generate a
> > couple of plots.  Everything is fine.  (BTW, this runs out of memory with
> > the standard 256Mb maximum-memory allocation.)
> > (5) I regenerate y (it is a simulated example), overwriting the old one.
> > (6) I run (4) again and get:
> >      Error: cannot allocate vector of size 45056kb
> >      Reached total allocation of 512Mb
> > This happens even if I run gc() after (5).
> > (7) Now comes the interesting part.  If I save the image right now, its
> > size is about 159Mb, although I do not have any new objects in my
> > workspace (there is a hidden object .Trace, but it is small).
> > (8) If, however, I do
> >      rm(list=ls(pos=1), pos=1)
> > the size of the workspace comes back to almost zero.
> >
> > I am not sure if this happens on Linux since my Linux box only has 192Mb of
> > memory.
> >
> > Is this the new memory management problem?  Is there any way around it?
>
> No, it's an old one, called fragmentation, I expect.  There may be no
> contiguous free block of 45Mb, and remember that R no longer moves
> objects, so the garbage collector cannot compact the heap to coalesce
> free space.  Also, the allocation over 256Mb may well be fragmented.
>
> I think you have the solution already: remove objects you are no longer
> using.
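
In practice that amounts to something like the sketch below (the object
names and the model formula are only illustrative, assuming the stale
lm fit is what is holding most of the 159Mb):

    rm(fit)                       # drop the no-longer-needed fit
    gc()                          # collect, so the space can be reused
    y <- rnorm(2^19)              # regenerate the simulated response
    fit <- lm(y ~ poly(x, 10))    # refit an 11-parameter model

Removing the large objects before regenerating y makes it much more
likely that a contiguous 45Mb block is available for the refit.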

Or increase the memory allocation to 2Gb: since you have big objects, NT's
virtual memory should do a good job.  That limit is really only there to
avoid paging on small systems, especially ones with VFAT file systems
(you are running NTFS, I hope).  You may still get allocation problems.
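
A sketch of the 2Gb route (the flag spelling here is that of the current
rw builds, so check the docs for your version):

    Rgui.exe --max-mem-size=2G

or put the flag in the Target field of the desktop shortcut.  Within a
session, memory.size(max=TRUE) (if your build provides it) shows how
much memory has actually been obtained from the OS.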



-- 
Brian D. Ripley,                  ripley@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272860 (secr)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-devel mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-devel-request@stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._