[R-sig-eco] Quantreg / R memory issue
Kingsford Jones
kingsfordjones at gmail.com
Fri Jan 29 23:41:53 CET 2010
Hi Dan,
Because quantreg is well tested, I suspect a memory leak (if one
exists) is occurring during the iteration and randomization. Try
entering the loop w/ a debugger (e.g. the debug function), monitor
which objects are created and their sizes (see the object.size function
and the functions pasted at the end of this msg), and check that unneeded
objects are removed and cleaned up (the gc function may be helpful).
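For example, here is a rough sketch of what that might look like
(perm_fit, the toy data, and the formula are hypothetical stand-ins for
your own randomization function and model):

library(quantreg)

dat <- data.frame(x = rnorm(300), y = rnorm(300))  # toy data, stand-in for yours

# hypothetical stand-in for the local function that randomizes and refits
perm_fit <- function(dat, taus) {
    dat$y <- sample(dat$y)                         # permute the response
    fit <- rq(y ~ x, tau = taus, data = dat, method = "br")
    coef(fit)
}

debug(perm_fit)                    # step through one call in the browser
perm_fit(dat, taus = c(0.5, 0.9, 0.95))
# while browsing, inspect memory use, e.g.
#   object.size(fit)               # size of the fitted object
#   gc()                           # force a collection and report memory usage
undebug(perm_fit)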
If you want to take things a step further and monitor the internals of
the rq function (i.e. the calls to Fortran), then an external debugger
such as valgrind is needed.
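(One common recipe is to start R under valgrind and run a script that
reproduces the crash, e.g.

R -d valgrind --vanilla -f crash_script.R

where crash_script.R is a placeholder for a small script that triggers
the problem; expect it to run very slowly.)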
As for the random timing of the error, many factors affect the size and
availability of contiguous chunks of free memory on your system, so the
point at which an allocation fails can vary from run to run.
hope it helps,
Kingsford Jones
(ps -- I'm curious how permutations help w/ the lack of independence)
# enhanced list of objects
# credit to P Pikal and D Hinds
.ls.objects <- function (pos = 1, pattern, order.by,
                         decreasing=FALSE, head=FALSE, n=5) {
    # apply fn to each named object in the given environment
    napply <- function(names, fn) sapply(names, function(x)
        fn(get(x, pos = pos)))
    names <- ls(pos = pos, pattern = pattern)
    obj.class <- napply(names, function(x) as.character(class(x))[1])
    obj.mode <- napply(names, mode)
    obj.type <- ifelse(is.na(obj.class), obj.mode, obj.class)
    obj.size <- napply(names, object.size)
    obj.dim <- t(napply(names, function(x)
        as.numeric(dim(x))[1:2]))
    # for objects without a dim attribute, report length in the Rows column
    vec <- is.na(obj.dim)[, 1] & (obj.type != "function")
    obj.dim[vec, 1] <- napply(names, length)[vec]
    out <- data.frame(obj.type, obj.size, obj.dim)
    names(out) <- c("Type", "Size", "Rows", "Columns")
    if (!missing(order.by))
        out <- out[order(out[[order.by]], decreasing=decreasing), ]
    if (head)
        out <- head(out, n)
    out
}
# shorthand
.ls <- function(..., n=10) {
    .ls.objects(..., order.by="Size", decreasing=TRUE, head=TRUE, n=n)
}
#example
# .ls()
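# For instance, a sketch of how you might watch memory inside the
# permutation loop (the loop body below is just a stand-in for your own
# randomization and rq() calls):
# for (i in 1:500) {
#     junk <- rnorm(1e5)               # stand-in for per-iteration objects
#     if (i %% 100 == 0) print(.ls())  # the top objects should not keep growing
#     rm(junk)                         # drop what is no longer needed
#     gc()                             # and return the memory
# }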
On Wed, Jan 27, 2010 at 7:46 PM, Dan Rabosky <drabosky at berkeley.edu> wrote:
>
> Hi -
>
> I've recently hit an apparent R issue that I cannot resolve (or understand, actually). This is possibly more of a computer science question, but it is probably relevant to others using quantreg for ecological analysis.
>
> I am using quantreg to fit a vector of quantiles to a dataset, approx 200-400 observations. To accommodate some autocorrelation issues, I have to assess significance with randomization. The problem is that I consistently observe what appears to be a memory problem causing an R crash. The problem occurs within a local function I am using to (i) randomize the data and (ii) run quantile regression on the randomized dataset.
>
> The crash only occurs (or so it seems) when I try to send rq() a vector of quantiles to fit. Even when I set the random number seed, the crash occurs on different iterations of the simulation. It sometimes occurs before rq() is called within the local function, and sometimes after the call. Sometimes it occurs after returning to the main function. It does occur at approximately the same iteration, though.
>
> I cannot explain this. It seems like using method="fn" instead of "br" avoids the crash, but I have not rigorously investigated this and would like to understand the problem. My dataset is well within the size range for which "br" is recommended. Is it possible that this problem is truly so memory intensive? It seems like not that many points. And why does this occur at roughly the same iteration every time? That would suggest that the memory issue is cumulative - shouldn't any memory consumed within rq(...) be freed up after I return???
>
> This is occurring with R 2.10.1 on a 64 bit machine running OSX 10.6.2 (6 GB RAM).
>
> Thanks!
> ~Dan Rabosky
>
> _______________________________________________
> R-sig-ecology mailing list
> R-sig-ecology at r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-ecology
>