[R] detecting the source of memory consumption (example provided)
Juliet Hannah
juliet.hannah at gmail.com
Thu Jan 22 22:35:56 CET 2009
I have read in a file (call it myData). The actual data are about
3,000 rows by 30,000 columns, and object.size() says myData takes:
> 737910472/(1024^2)
[1] 703.7263
Unfortunately, my program ends up using 40 GB, as indicated by maxvmem
on Unix, which causes my department's cluster to stop working.
Perhaps I have some copying going on that I cannot find. I have
created an example below that mimics my program. Could someone help me
find my error? I am also confused about how to use Rprofmem to study
this problem; I have put a rough sketch of what I had in mind after
the example. Thanks for your time.
Regards,
Juliet
#begin example
# simulate a small data set that mimics the structure of the real one
response <- rnorm(50)
x1 <- sample(c(1,2), 50, replace=TRUE)
age <- sample(seq(20,80), 50, replace=TRUE)
id <- rep(1:25, each=2)
var1 <- rnorm(50)
var2 <- rnorm(50)
var3 <- rnorm(50)
myData <- data.frame(response, x1, age, id, var1, var2, var3)

numVars <- ncol(myData) - 4                         # variables to test (var1..var3)
pvalues <- rep(-1, numVars)                         # placeholder p-values
names(pvalues) <- colnames(myData)[5:ncol(myData)]
library(yags)

# fit one GEE per variable and store the p-value for the x1:var interaction
for (Var_num in 1:numVars)
{
  fit.yags <- yags(myData$response ~
                     myData$age + myData$x1*myData[,(Var_num+4)],
                   id=myData$id, family=gaussian,
                   corstr="exchangeable", alphainit=0.05)
  # Wald test for the interaction term (5th coefficient)
  z.gee <- fit.yags@coefficients[5]/sqrt(fit.yags@robust.parmvar[5,5])
  pval <- 2 * pnorm(abs(z.gee), lower.tail = FALSE)
  pvalues[Var_num] <- signif(pval, 3)
}
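#end example

In case it is useful, here is roughly what I had in mind for the
profiling, though I am not sure it is the right approach. This is only
a sketch: it assumes R was built with --enable-memory-profiling, and
"memprofile.out" is just a made-up file name.

# sketch only: log every allocation of 1 MB or more while fitting one
# model (here the first variable, var1, i.e. column 5)
Rprofmem("memprofile.out", threshold = 1048576)
fit.yags <- yags(myData$response ~ myData$age + myData$x1*myData[,5],
                 id=myData$id, family=gaussian,
                 corstr="exchangeable", alphainit=0.05)
Rprofmem("")                            # turn memory profiling off
readLines("memprofile.out", n = 20)     # inspect the first recorded allocations

# and to see whether the big data frame itself is being copied:
tracemem(myData)     # prints a message each time R duplicates myData
# ... run one iteration of the loop here ...
untracemem(myData)

Is this how people usually track down where the extra memory is coming
from, or is there a better way?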