[R] Memory hungry routines
Duncan Murdoch
murdoch.duncan at gmail.com
Mon Dec 29 20:32:50 CET 2014
On 29/12/2014 1:52 PM, ALBERTO VIEIRA FERREIRA MONTEIRO wrote:
> Is there any way to detect which calls are consuming memory?
The Rprofmem() function can do this, but you need to build R with
--enable-memory-profiling to use it. Rprof() does a more limited version
of the same thing if run with memory.profiling = TRUE.
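For example, a minimal sketch of how that might look (the file names and
the profiled script are placeholders, not from the original post):

Rprof("mem.out", memory.profiling = TRUE)   # start profiling, recording memory use
source("myscript.R")                        # run the code being investigated (placeholder)
Rprof(NULL)                                 # stop profiling
summaryRprof("mem.out", memory = "both")    # summarise time and memory use by call

# In a build with memory profiling enabled, Rprofmem() logs each large
# allocation together with the call stack that made it:
Rprofmem("allocs.out", threshold = 1e6)     # log allocations bigger than ~1 MB
source("myscript.R")                        # placeholder for the real workload
Rprofmem(NULL)                              # stop logging; inspect allocs.out by hand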
Duncan Murdoch
>
> I run a program whose global variables take up about 50 Megabytes of
> memory, but when I monitor its progress it seems to be allocating
> 150 Megabytes, with peaks of up to 2 Gigabytes.
>
> I know that the global variables aren't "copied" many times by the
> routines, but I suspect something weird must be happening.
>
> Alberto Monteiro
>
> PS: the lines below count the memory allocated to all global
> variables; they could probably be adapted to track local variables:
>
> y <- ls(pat = "")                         # get the names of all global variables
> z <- rep(0, length(y))                    # create a vector of sizes
> for (i in 1:length(y)) z[i] <- object.size(get(y[i]))  # size of each variable, in bytes
> # BTW, is there any way to vectorize the above loop?
> xix <- sort.int(z, index.return = TRUE)   # sort the sizes
> y <- y[xix$ix]                            # apply the sort to the variable names
> z <- z[xix$ix]                            # apply the sort to the sizes
> y <- c(y, "total")                        # add a row for the total
> z <- c(z, sum(z))                         # sum them all
> cbind(y, z)                               # ugly way to list them
>
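On the vectorization question in the comments above, one possible sketch
(base R only; roughly equivalent to the explicit loop) is to let sapply()
do the looping over the variable names:

y <- ls(envir = .GlobalEnv)                 # names of all global variables
z <- sapply(y, function(nm) object.size(get(nm, envir = .GlobalEnv)))
ord <- order(z)                             # ordering by increasing size
cbind(c(y[ord], "total"), c(z[ord], sum(z)))  # sorted sizes plus a grand total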