[Rd] Estimate actual memory usage, not cumulative allocated

Renaud Gaujoux getoxxx at gmail.com
Sun Feb 7 15:47:18 CET 2010


I'd like to know how to estimate the memory actually used by some function.
The function essentially contains a for loop, which stops after a
variable number of iterations, depending on the input data.
I used Rprof with memory.profiling=TRUE, but the reported memory seems
to increase with the number of iterations. My understanding is that the
reported memory is cumulative and based on allocation (am I right?).
On each iteration the same object is updated in place, so I'm not
expecting memory usage to grow from this part.
However, the update computation itself needs to allocate memory for
temporary objects, which would explain why the reported memory
increases with the number of iterations.
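That interpretation matches how R's allocation profiling works: every (large) allocation is recorded as it happens, whether or not the garbage collector frees it later. A quick way to see this (a sketch, not from the original post; it needs an R build with memory profiling enabled, as the CRAN binaries are) is Rprofmem(), which logs one line per large allocation:

```r
## Sketch: Rprofmem() logs each allocation above the threshold to a
## file, so temporaries created on every iteration show up again and
## again even though the garbage collector frees them in between.
log_file <- tempfile()
Rprofmem(log_file, threshold = 1e6)  # log allocations of >= 1 Mb
x <- 0
for (i in 1:5) {
  x <- x + runif(1e6)  # each iteration allocates ~8 Mb temporaries
}
Rprofmem(NULL)               # stop logging
length(readLines(log_file))  # at least one log line per temporary
```

The log keeps growing with the number of iterations even though `x` itself stays the same size, which is consistent with the cumulative numbers reported by Rprof.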
I just want to know the minimum amount of memory I need to run the
computation, which is not the sum of the memory allocated during the
computation (since some of it is temporary and should be released by R
when necessary). How could I estimate this value?
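One possible approach (a sketch, not from the original post): gc(reset = TRUE) resets the "max used" statistics that the garbage collector maintains, so reading them after the computation gives the high-water mark of live memory rather than the cumulative allocation total. The helper peak_mb below is hypothetical, not part of any package:

```r
## Sketch: estimate the peak (not cumulative) memory footprint of an
## expression via the "max used" statistics kept by gc().
peak_mb <- function(expr) {
  gc(reset = TRUE)   # zero the "max used" counters
  force(expr)        # evaluate the (lazily passed) computation
  g <- gc()          # read the current statistics matrix
  i <- which(colnames(g) == "max used")
  sum(g[, i + 1])    # the Mb column that follows "max used"
}

## Example: a ~76 Mb temporary vector dominates the peak
peak_mb(sum(numeric(1e7)))
```

Note that the result includes whatever R already held live before the call, so it is an upper bound on the extra workspace the computation needs; unlike the profiler totals, repeated runs should report a stable value rather than one that grows with the number of iterations.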

# Sample code:
# (update and stop are placeholders for the real functions; note that
# the name 'stop' masks base::stop, so a different name is advisable)

x <- my.object
for (i in 1:500) {
    # compute next value
    x <- update(x)

    # stop depending on the current value
    if (stop(x)) break
}
More information about the R-devel mailing list