[Rd] Memory management problem?

apjaworski@mmm.com
Tue, 12 Jun 2001 12:48:17 -0500

I just ran into the following problem.  I am not sure what causes it, but
here is the scenario:

(1) I start R with 512Mb of memory.  (I am on a Win2000 PC with 512Mb of
physical memory.)
(2) I put a couple of small functions in my workspace and I generate two
real vectors x and y of length 2^19 (524288) each.
(3) I save this workspace.  The size of the saved file is about 8.4Mb.
(4) I run lm to fit y vs. x using an 11-parameter model, and I generate a
couple of plots.  Everything is fine.  (BTW, this runs out of memory with
the standard 256Mb maximum-memory allocation.)
(5) I regenerate y (it is a simulated example) overwriting the old one.
(6) I run (4) again and I am getting:
     Error: cannot allocate vector of size 45056kb
     Reached total allocation of 512Mb
    This happens even if I run gc() after (5).
(7) Now comes the interesting part.  If I save the image right now, its
size is about 159Mb, although I do not have any new objects in my workspace
(there is a hidden object .Trace, but it is small).
(8) If, however, I do
     rm(list=ls(pos=1), pos=1)
    the size of the workspace comes back to almost zero.
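A minimal R sketch of the scenario above, assuming a simulated y and an
11-parameter polynomial model (the original functions and model formula are
not shown in the post, so these are placeholders):

```r
## Sketch of steps (2)-(8); the data-generating process and the model
## are assumptions, not the original code.
set.seed(1)
n <- 2^19                          # 524288 observations
x <- rnorm(n)
y <- x + rnorm(n)                  # (2) simulated response

fit <- lm(y ~ poly(x, 10))         # (4) 11 parameters: intercept + 10 terms

y <- x + rnorm(n)                  # (5) regenerate y, overwriting the old one
gc()                               # request garbage collection before refitting
fit <- lm(y ~ poly(x, 10))         # (6) this second fit triggered the error

## (8) clearing the workspace brings the saved image back to almost zero
rm(list = ls(pos = 1), pos = 1)
```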

I am not sure if this happens on Linux, since my Linux box only has 192Mb of
memory.

Is this the new memory management problem?  Is there any way around it?

Thanks in advance,


Andy Jaworski
Engineering Systems Technology Center
3M Center, 518-1-01
St. Paul, MN 55144-1000
E-mail: apjaworski@mmm.com
Tel:  (651) 733-6092
Fax:  (651) 736-3122

r-devel mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-devel-request@stat.math.ethz.ch