[BioC] Memory Problems
James MacDonald
jmacdon at med.umich.edu
Thu Dec 18 14:45:25 MET 2003
You don't mention what version of R you are using, nor your OS. However,
since you are having memory re-allocation problems, I have to assume you
are on win32 and running something older than R-1.8.1-patched or R-1.9.0.
My understanding of memory issues in win32 with earlier versions of R
is that the memory allocation process is sort of one-way, so you can run
out of memory even if you are running the garbage collector to reclaim
it. I am sure this is not technically correct, and if BDR were
subscribed to this list he would correct me, but the effect remains: if
you allocate too much memory to big objects, you will eventually run out
even if you try to reclaim it.
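You can get a feel for this on win32 with something along these lines
(memory.size() is Windows-only, and the object size is arbitrary):

x <- matrix(rnorm(5e6), ncol = 10)   # grab a biggish object, roughly 40 Mb
memory.size()                        # Mb that R has obtained from Windows
rm(x)
gc()                                 # gc() reports the cells as freed...
memory.size()                        # ...but the Mb held from Windows may well not drop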
The patched version of R and R-1.9.0 have a different malloc that is
supposed to be better at reclaiming memory, so you might go to Duncan
Murdoch's website and get one or the other.
http://www.stats.uwo.ca/faculty/murdoch/software/r-devel/
Best,
Jim
James W. MacDonald
Affymetrix and cDNA Microarray Core
University of Michigan Cancer Center
1500 E. Medical Center Drive
7410 CCGC
Ann Arbor MI 48109
734-647-5623
>>> "david neil hayes" <davidneilhayes at hotmail.com> 12/17/03 04:15PM
>>>
Thanks to Dr. Huber for the response to my earlier question. Here is another
matchprobes question that may have more general interest, since it concerns
memory usage (which in my experience has been a bigger problem than
processing speed).
I have a folder of files, each file representing one affybatch object
(which is a single array). I am using the "load" command to read these
files in batches of 10, then I perform a "combine" function. I save the
results to a file, then move on to the next batch of 10.
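Schematically, each batch looks something like this (the file and object
names here are placeholders, not my real ones):

library(affy)                        # AffyBatch class; combine() is the Biobase generic

files <- paste("array", 1:10, ".Rdata", sep = "")   # one batch of 10 saved affybatch objects
merged <- NULL
for (f in files) {
    nm <- load(f)                    # load() returns the name of the restored object
    ab <- get(nm[1])
    merged <- if (is.null(merged)) ab else combine(merged, ab)
    rm(ab)
}
save(merged, file = "batch01.Rdata")
rm(merged)
gc()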
I find that my page file usage continues to increase, even though I have
"removed" the original 10 affybatch objects and all references to them.
As you might expect, I quickly exhaust my RAM. I have been unable to solve
this on my own. In talking with some of the Bioconductor staff, I
understand this may relate to the environments used in the affy package.
To reduce my memory usage I have tried:

affybatch <- 0     # overwrite the object with something small
gc()               # run the garbage collector
rm(affybatch)      # then remove the name from the workspace
I have also tried putting the entire batching process in a separate
function, from which I exit before moving on to the next batch.
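By that I mean something along these lines (placeholder names again); the
idea is that everything created inside the function is local and should be
collectable once the call returns:

library(affy)                        # AffyBatch class; combine() is the Biobase generic

process.batch <- function(fnames, outfile) {
    merged <- NULL
    for (f in fnames) {
        nm <- load(f)                # load() restores into the function's environment
        merged <- if (is.null(merged)) get(nm[1]) else combine(merged, get(nm[1]))
        rm(list = nm[1])             # drop the per-file binding right away
    }
    save(merged, file = outfile)
    invisible(NULL)                  # return nothing, so no reference survives the call
}

process.batch(paste("array", 1:10, ".Rdata", sep = ""), "batch01.Rdata")
gc()                                 # nothing in the workspace refers to that batch now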