[BioC] Memory Problems

david neil hayes davidneilhayes at hotmail.com
Thu Dec 18 15:47:04 MET 2003


Thanks for the insight; I will try this.  You are correct that I am using
a Windows machine running R 1.8.1.

Would this problem (and many related problems) be substantially
alleviated if I switched to a Linux system?

Thanks,
Neil


>From: "James MacDonald" <jmacdon at med.umich.edu>
>To: <davidneilhayes at hotmail.com>,<Bioconductor at stat.math.ethz.ch>
>Subject: Re: [BioC] Memory Problems
>Date: Thu, 18 Dec 2003 08:45:25 -0500
>
>You don't mention which version of R you are using, nor your OS.
>However, since you are having memory re-allocation problems, I have to
>assume you are on win32 and running something older than R 1.9.0 or
>1.8.1-patched.
>
>My understanding of memory issues on win32 with earlier versions of R
>is that memory allocation is effectively one-way, so you can run out of
>memory even if you run the garbage collector to reclaim it. I am sure
>this is not technically precise, and if BDR were subscribed to this
>list he would correct me, but the effect remains: if you allocate too
>much memory to big objects, you will eventually run out even if you try
>to reclaim it.
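>
>One way to watch this happen in a session (a rough sketch only:
>memory.size() and memory.limit() are win32-only, and the matrix size
>here is arbitrary):
>
>   gc()                        # report memory in use and the gc trigger
>   memory.size()               # memory currently used by R (win32 only)
>   memory.limit()              # memory ceiling for this session (win32 only)
>   x <- matrix(0, 2000, 2000)  # allocate a ~32 Mb numeric matrix
>   rm(x)                       # drop the only reference to it
>   gc()                        # ask R to reclaim the memory
>   memory.size()               # on older win32 builds this may not fall back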
>
>The patched version of R and R-1.9.0 have a different malloc that is
>supposed to be better at reclaiming memory, so you might go to Duncan
>Murdoch's website and get one or the other.
>
>http://www.stats.uwo.ca/faculty/murdoch/software/r-devel/
>
>Best,
>
>Jim
>
>
>
>James W. MacDonald
>Affymetrix and cDNA Microarray Core
>University of Michigan Cancer Center
>1500 E. Medical Center Drive
>7410 CCGC
>Ann Arbor MI 48109
>734-647-5623
>
> >>> "david neil hayes" <davidneilhayes at hotmail.com> 12/17/03 04:15PM >>>
>Thanks to Dr. Huber for the response to my earlier question.  Here is
>another matchprobes question that may be of more general interest, as
>it concerns memory usage (which in my experience has been a bigger
>problem than processing speed).
>
>I have a folder of files, each file representing one AffyBatch object
>(which is a single array).  I am using the "load" command to read these
>files in batches of 10, then I perform a "combine" step.  I save the
>result to a file, then move on to the next batch of 10.
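>
>In code, the loop looks roughly like this (a sketch only: the directory
>name and output file names are illustrative, each file is assumed to
>hold exactly one AffyBatch, and combine() stands for whatever merge
>step applies, e.g. affy's merge.AffyBatch):
>
>   library(affy)               # AffyBatch class; combine() from Biobase
>   files <- list.files("celdata", full.names = TRUE)
>   nbatch <- ceiling(length(files) / 10)
>   for (i in 1:nbatch) {
>       chunk <- files[((i - 1) * 10 + 1):min(i * 10, length(files))]
>       merged <- NULL
>       for (f in chunk) {
>           e <- new.env()
>           load(f, envir = e)  # each file holds one AffyBatch
>           ab <- get(ls(e)[1], envir = e)
>           merged <- if (is.null(merged)) ab else combine(merged, ab)
>       }
>       save(merged, file = paste("batch", i, ".rda", sep = ""))
>       rm(merged, ab, e)       # drop all references to the batch
>       gc()                    # then ask for the memory back
>   }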
>
>I find that my page file usage continues to increase, even though I
>have "removed" the original 10 AffyBatch objects and all references to
>them.  As you might expect, I quickly exhaust my RAM.  I have been
>unable to solve this on my own.  In talking with some of the
>Bioconductor staff, I understand this may relate to the environments
>used in the affy package.
>
>To reduce my memory usage I have tried:
>   affybatch <- 0
>   gc()
>   rm(affybatch)
>   putting the entire batching process in a separate function, from
>   which I exit before moving on to the next batch (see the sketch
>   after this list)
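>
>In outline, that last approach looks like this (again a sketch:
>processBatch is a placeholder, reusing files and nbatch from the loop
>above):
>
>   processBatch <- function(chunk, outfile) {
>       # load, combine, and save as in the loop above; everything
>       # bound in here is local, so it becomes collectable once the
>       # call returns
>       invisible(NULL)         # return nothing that keeps data alive
>   }
>   for (i in 1:nbatch) {
>       chunk <- files[((i - 1) * 10 + 1):min(i * 10, length(files))]
>       processBatch(chunk, paste("batch", i, ".rda", sep = ""))
>       gc()                    # collect the now-unreferenced locals
>   }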
>
