[R] Does R accumulate memory
Duncan Murdoch
murdoch at stats.uwo.ca
Sat Jan 8 23:36:47 CET 2005
On Sat, 8 Jan 2005 16:38:31 -0500, "Doran, Harold" <HDoran at air.org>
wrote:
>Dear List:
>
>I am running into a memory issue that I haven't noticed before. I am
>running a simulation with all of the code used below. I have increased
>my memory to 712 MB and have a total of 1 GB on my machine.
>
>What appears to be happening is I run a simulation where I create 1,000
>datasets with a sample size of 100. I then run each dataset through a
>gls and obtain some estimates.
>
>This works fine. But, when I view how much memory is being used in
>Windows, I see that it does not reduce once the analysis is complete. As
>a result, I must quit R and then perform another analysis.
If you ask Windows how much memory is being used, you'll likely get a
misleading answer. R may not release memory back to the OS, but that
memory is still available for re-use within R.
Call gc() to see how much memory R thinks is in use.
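For example (note that calling gc() also triggers a garbage collection
as a side effect):

    gc()    # reports Ncells/Vcells currently in use, in cells and Mb

The "used" figures are what R itself is holding on to, which is usually
much smaller than what the Windows task manager attributes to the R
process.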
>So for example, before starting the 1st simulation, my Windows task
>manager tells me I am using 200 MB of memory. After running the first
>simulation it may go up to 500 MB. I then try to run another simulation
>with a larger sample size, but I quickly run out of memory because it
>starts at 500 MB and increases from there, and the simulation halts.
The difficulty you're running into may be memory fragmentation. When
you run with a larger sample size, R will try to allocate larger
chunks than it did originally. If the "holes" left behind when the
original simulation's objects are deleted are too small, R will need
to ask Windows for new memory to store things in.
You could try deleting everything in your workspace before running the
2nd simulation; this should reduce the fragmentation. Or you could
run the big simulation first; the smaller one will then fit in the
holes it leaves behind.
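A minimal sketch of the first option (this assumes nothing in your
workspace needs to be kept between runs):

    rm(list = ls())   # delete every object in the workspace
    gc()              # collect, so the freed space can be re-used by R

Then start the larger simulation from this emptier state.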
Duncan Murdoch