[BioC] running out of memory
Steve Lianoglou
mailinglist.honeypot at gmail.com
Wed Jan 20 17:02:46 CET 2010
Hi,
2010/1/20 Javier Pérez Florido <jpflorido at gmail.com>:
> Dear list,
> I'm trying to normalize several CEL files (around 100) using threestep,
> and I get the following error:
>
> Error: cannot allocate vector of size 6.5 Mb
>
> I've been reading the suggestions about this error on the mailing list,
> but I couldn't fix it. My system (Win XP Professional, 32-bit) has 3 GB
> of RAM, and I did the following:
>
> * I used the command-line option --max-mem-size=3071M in the R shortcut
> * I changed the boot.ini file to allow up to 3GB
>       [boot loader]
>       timeout=30
>       default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
>       [operating systems]
>       multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /3GB /noexecute=optin /fastdetect
> * I tried to remove unused variables with rm() and to trigger the
> garbage collector with gc() in the source code.
>
> But, even with those changes, I'm still running out of memory. It is
> strange because, looking at the Task Manager, R never uses more than
> 1600 MB and the whole system never goes above 2 GB.
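
For context, here is a minimal sketch of the kind of threestep session
described above, assuming the CEL files sit in the current working
directory (the method arguments shown are simply the affyPLM defaults,
not necessarily the settings Javier used):

  library(affyPLM)   # provides threestep(); pulls in the affy package

  ## Check that the 3 GB setting actually took effect (Windows-only calls)
  memory.limit()     # maximum allocation, in MB
  memory.size()      # memory currently in use, in MB

  ## Read all CEL files in the working directory and normalize them
  abatch <- ReadAffy()
  eset <- threestep(abatch,
                    background.method = "RMA.2",
                    normalize.method = "quantile",
                    summary.method = "median.polish")

  ## Drop the raw probe-level object before doing anything else
  rm(abatch)
  gc()
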
I can't really comment on tweaking memory settings on Windows systems,
but if all you're doing is trying to normalize a boat-load of affy
arrays together, I understand that the aroma.affymetrix package can do
so while keeping memory requirements down. Perhaps you might consider
looking into it until someone can give you better advice:
http://groups.google.com/group/aroma-affymetrix/web/overview
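
To give a rough feel for what that looks like, here is a sketch of the
standard RMA-style pipeline from the aroma.affymetrix documentation; the
chip type, data set name, and the rawData/<dataSetName>/<chipType>/
directory layout below are placeholders, not details from Javier's setup:

  library(aroma.affymetrix)

  ## aroma.affymetrix reads CEL files from rawData/<dataSetName>/<chipType>/
  ## and writes each processing step back to disk, so only a small part of
  ## the data needs to be in memory at any one time.
  cdf <- AffymetrixCdfFile$byChipType("HG-U133_Plus_2")  # placeholder chip type
  cs <- AffymetrixCelSet$byName("MyDataSet", cdf = cdf)  # placeholder data set

  ## Background correction
  bc <- RmaBackgroundCorrection(cs)
  csBC <- process(bc)

  ## Quantile normalization of the PM probes
  qn <- QuantileNormalization(csBC, typesToUpdate = "pm")
  csN <- process(qn)

  ## Probe-level model fit (RMA-style), chip effects stored on disk
  plm <- RmaPlm(csN)
  fit(plm)
  ces <- getChipEffectSet(plm)  # per-array probeset estimates, file-backed
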
HTH,
-steve
--
Steve Lianoglou
Graduate Student: Computational Systems Biology
| Memorial Sloan-Kettering Cancer Center
| Weill Medical College of Cornell University
Contact Info: http://cbio.mskcc.org/~lianos/contact