[R] Once again in search of memory
Eric Lecoutre
lecoutre at stat.ucl.ac.be
Wed May 26 12:23:32 CEST 2004
Hello R users,
I am working with R 1.8.1 or R 1.9 on Windows, with 256 MB of RAM.
I am trying to debug, correct and speed up the execution of one of our
students' programs. Basically, we perform simulations.
R is launched with high memory limits.
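(For reference, this is the kind of start-up I mean; the 512 MB value below is
only an illustration, not our exact setting:)

Rgui.exe --max-mem-size=512M       # start-up flag of the Windows build
> memory.limit()                   # current limit in MB (Windows only)
> memory.limit(size = 512)         # raise the limit from within R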
One execution of the program requires nearly 200 MB and is fine the first
time. Launching it again, however, does strange things. Seen from the Windows
Task Manager, the RAM used by R never exceeds those 200 MB (it is often near
130 MB). Seen from R:
> gc(TRUE)
Garbage collection 280 = 85+59+136 (level 2) ...
396784 cons cells free (40%)
145.7 Mbytes of heap free (53%)
            used  (Mb) gc trigger  (Mb)
Ncells    587240  15.7     984024  26.3
Vcells  16866491 128.7   35969653 274.5
And then each new call to the function foo() increases this memory use
further: the Vcells figure keeps growing.
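(To show what I mean, here foo() stands for the student's simulation function
and the loop only illustrates the pattern:)

for (i in 1:3) {
    foo()                               # one complete simulation run
    print(gc()["Vcells", "used"])       # this number climbs on every call
}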
How come the Windows Task Manager states that R uses only 71 MB of RAM, as
seen in the attached screenshot?
Is this a known issue? Is there any tip to "really" release the memory held by
all those objects we don't use anymore?
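(By "really" release, I mean something beyond the obvious sequence sketched
below:)

rm(list = ls())    # drop every object in the global workspace
gc()               # then ask for a garbage collection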
I also tried memory.profile(), which shows many list-type objects, although we
only manipulate matrices. Could these lists come from calls to 'apply'?
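(To illustrate, a toy check one could do to see whether apply() leaves
list-type objects behind; the sizes here are illustrative, our real matrices
are much larger:)

before <- memory.profile()                # object counts by internal type
m  <- matrix(rnorm(1e5), ncol = 100)      # small stand-in for our matrices
cs <- apply(m, 2, function(x) sum(x^2))   # typical column-wise computation
after <- memory.profile()
after - before                            # does the list/pairlist count jump?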
Thanks for any insights and advice on how to handle memory. I keep going round
in circles in the help pages.
Eric
Eric Lecoutre
UCL / Institut de Statistique
Voie du Roman Pays, 20
1348 Louvain-la-Neuve
Belgium
tel: (+32)(0)10473050
lecoutre at stat.ucl.ac.be
http://www.stat.ucl.ac.be/ISpersonnel/lecoutre
If the statistics are boring, then you've got the wrong numbers. -Edward
Tufte