[Rd] Estimate actual memory usage, not cumulative allocated

Renaud Gaujoux getoxxx at gmail.com
Sun Feb 7 17:54:48 CET 2010


Hi Sean,

I know I'll have to optimize the memory management (maybe using the
proto or R.oo packages), but for the moment I'd like to estimate the
amount of memory actually used by the call.

I got an estimate by doing:

g1 <- gc(reset = TRUE)
my.function(input.data)
g2 <- gc()
sum(g2[, 6] - g1[, 2])
# -> difference between the max memory used during the call (g2, column 6)
# and the memory in use just before it (g1, column 2), summed over the
# Ncells and Vcells rows

Does it make sense?

I was happy with it, but the problem is that the result does not seem to
depend on the size of input.data, which is a matrix used internally in
matrix products. Even reducing input.data from 5000 rows to 50 rows did
not change the result: ~20 Mb in both cases.
Something to do with the garbage collector's trigger level, I imagine?
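
For reference, here is a minimal, self-contained sketch of this
reset-based measurement wrapped in a helper (the helper name is mine,
and the gc() column positions, 6 = "max used (Mb)" and 2 = "used (Mb)",
are assumed; they may differ across R versions and platforms):

```r
# Hypothetical helper: estimate the peak memory (in Mb) used while
# evaluating `expr`, by resetting gc()'s "max used" statistics before
# the call and reading them back afterwards.
peak_mem_mb <- function(expr) {
  g1 <- gc(reset = TRUE)   # reset max-used counters; g1 is the baseline
  force(expr)              # lazy evaluation: expr is only evaluated here
  g2 <- gc()
  # g2[, 6] = max used (Mb) since the reset; g1[, 2] = used (Mb) before.
  # Sum over the two rows (Ncells and Vcells).
  sum(g2[, 6]) - sum(g1[, 2])
}

# Small vs. larger intermediate results: below the gc trigger level the
# two measurements may come out similar, since R only reclaims (and thus
# records) memory when a collection is actually triggered.
small <- peak_mem_mb(matrix(0, 50, 100) %*% matrix(0, 100, 50))
large <- peak_mem_mb(matrix(0, 2000, 100) %*% matrix(0, 100, 2000))
```

The 2000x2000 result alone occupies ~32 Mb, so `large` should clearly
exceed the baseline, while `small` may be dominated by the trigger
granularity, which would explain the constant ~20 Mb observed above.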

Thanks,
Renaud

Sean O'Riordain wrote:
> Renaud,
>
> I could be wrong... but generally in R each assignment creates a new
> object, which is why looping is slow; i.e. you're not actually updating
> in place, you're creating a new version of the original.
>
> cheers,
> Sean
>
>
> On Sun, Feb 7, 2010 at 2:47 PM, Renaud Gaujoux <getoxxx at gmail.com> wrote:
>
>     Hi,
>
>     I'd like to know how to estimate the memory actually used by some
>     function call.
>     The function essentially contains a for loop, which stops after a
>     variable number of iterations, depending on the input data.
>     I used Rprof with memory.profiling=TRUE, but the memory results
>     seem to increase with the number of iterations. My understanding is
>     that the reported memory is cumulative and based on allocation (am
>     I right?). After each iteration the same object is updated, so I'm
>     not expecting any memory usage from this part. However, the update
>     computation itself needs to allocate memory for temporary objects,
>     which would be why the reported memory increases with the number of
>     iterations.
>     I just want to know the minimum amount of memory needed to run the
>     computation, which would not be the sum of the memory allocated
>     during the computation (as some of it is temporary and should be
>     released by R if necessary).
>     How could I estimate this value?
>
>     # Sample code:
>
>     x <- my.object
>     for( i in 1:500 ){
>        # compute next value
>        x <- update(x)
>
>        # stop depending on the current value
>        # (converged() named so as not to mask base::stop)
>        if( converged(x) ) break
>     }
>
>
>     Thanks,
>     Renaud
>
>     ______________________________________________
>     R-devel at r-project.org mailing list
>     https://stat.ethz.ch/mailman/listinfo/r-devel
>
>


