[R] Memory woes
John D. Barnett
jbarnett at wi.mit.edu
Fri Jan 28 16:43:39 CET 2000
I'm having some problems with memory consumption under R. I've tried
increasing the appropriate memory values, but it keeps asking for more;
I've even upped the heap size to 600M, which eats significantly into swap
(256M RAM, 500+M swap), so performance slows to a crawl.
What I'm trying to do is run isoMDS on a 4000x4000 matrix.
My first question is, how much memory should this matrix occupy? Is it
~4000^2 * sizeof(double)?
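For reference, here is the back-of-the-envelope arithmetic I have in mind
(assuming sizeof(double) is 8 bytes, as on most platforms):

```r
# Rough size of a 4000 x 4000 numeric (double) matrix:
n <- 4000
bytes <- n^2 * 8        # 8 bytes per double -> 128,000,000 bytes
bytes / 2^20            # about 122 MiB

# Measuring directly (this allocates the matrix, so it needs the memory):
# object.size(matrix(0, n, n))
```

So a single copy is roughly 128M; two or three copies would already
approach my physical RAM.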
I have an idea about what's going on, but I'm not sure; perhaps someone
could correct me if this interpretation is wrong. Since R uses
call-by-value, all data structures are first duplicated, and then handed
off to the function called. I was getting out of memory errors in
calling isoMDS; once I got around that, and waited to see what would
happen next (after about 5 minutes of swapping), I got an error saying
cmdscale not found. Since isoMDS begins:
isoMDS <- function(d, y=cmdscale(d, 2), maxit=50, trace=TRUE)
this suggests that all the time I was waiting was spent copying d.
Is this correct?
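One detail that may matter here (if I understand the evaluation model
correctly): default arguments in R are evaluated lazily, only when the
function first uses them, which might explain why the cmdscale error
showed up only after the long wait. A minimal illustration:

```r
# Default arguments are evaluated lazily, only when first used:
f <- function(x, y = stop("y was evaluated")) {
  x + 1            # y is never touched, so no error is raised
}
f(1)               # returns 2

g <- function(x, y = some_missing_fn(x)) {
  y                # forcing y triggers the lookup, and only then the error
}
# g(1) fails here with "could not find function", not at call time
```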
Since the global environment is actually accessible from within
functions, could I modify this to use call-by-reference? Are there any
pitfalls I should watch out for?
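The kind of thing I'm imagining (a hedged sketch, not tested on the real
problem) is passing an environment instead of the matrix itself, since
environments are not copied on function call:

```r
# Sketch of environment-based "call by reference":
# passing an environment does not duplicate its contents.
env <- new.env()
env$d <- matrix(1:4, 2, 2)   # small stand-in for the 4000x4000 matrix

double_in_place <- function(e) {
  # modifies the matrix stored in the caller's environment;
  # no full duplicate of d is handed to the function
  e$d <- e$d * 2
  invisible(NULL)
}

double_in_place(env)
env$d                        # the caller's copy has been modified
```

The obvious pitfall I can see is that functions like isoMDS expect a
matrix (or dist object), not an environment, so I'd presumably have to
rewrite them to work this way, and accept the side effects on the caller.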
In case this is relevant, here are my system particulars:
PII, 256M ram + ~500M swap
RedHat 6.1 Linux
Thanks in advance for your helpful replies!
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !) To: r-help-request at stat.math.ethz.ch
More information about the R-help mailing list