[R] Memory issue

kMan kchamberln at gmail.com
Thu May 6 04:34:37 CEST 2010


Dear Alex,

Has manual garbage collection had any effect?
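For concreteness, something along these lines (the object name dat is
taken from your post; this is only a sketch):

    rm(dat)   # drop the binding to the large data frame
    gc()      # force a collection; also prints a memory summary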

Sincerely,
KeithC.

-----Original Message-----
From: Alex van der Spek [mailto:doorz at xs4all.nl] 
Sent: Wednesday, May 05, 2010 3:48 AM
To: r-help at r-project.org
Subject: [R] Memory issue

I am reading a flat text file, 138 MB in size, into R with a combination of
scan (to get the header) and read.table. After converting the text time
stamps to POSIXct and the integer codes to factors, I combine everything
into one data frame and release the old structures containing the data with
rm().
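In outline, the code looks roughly like this; the file name, column names,
and time stamp format below are placeholders, not the real ones:

    hdr <- scan("data.txt", what = character(), nlines = 1, quiet = TRUE)
    tmp <- read.table("data.txt", skip = 1, col.names = hdr,
                      stringsAsFactors = FALSE)
    tmp$stamp <- as.POSIXct(tmp$stamp, format = "%Y-%m-%d %H:%M:%S")
    tmp$code  <- factor(tmp$code)   # integer codes to factors
    dat <- tmp                      # the one data frame I keep
    rm(tmp)                         # release the old structure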

Strangely, rm() does not appear to reduce the memory in use; I checked with
memory.size(). Worse still, the amount of memory required keeps growing.
When I save an image, the .RData file is only 23 MB, yet at some point in
the program, after having done nothing particularly demanding (two- and
three-way frequency tables and some lattice graphs), the amount of memory
in use is over 1 GB.
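The numbers I quote come from checks like these (memory.size() is
Windows-only):

    memory.size()            # MB currently allocated to R
    memory.size(max = TRUE)  # maximum allocated so far this session
    gc()                     # report of cells used and trigger levels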

Not yet a problem, but it will become one. This is R 2.10.0 on Windows
Vista.

Does anybody know how to release memory, since rm(dat) does not appear to
do this properly?

Regards,
Alex van der Spek


