[R] memory problem
Peter Dalgaard
p.dalgaard at biostat.ku.dk
Tue Mar 9 00:13:13 CET 2004
"Joshi, Nina (NIH/NCI)" <joshini at mail.nih.gov> writes:
> I am trying to load 143 Affymetrix chips into R on the NIH
> Nimbus server. I can load 10 chips without a problem; however, when I try
> to load 143 I receive an error message: cannot create a vector of 523263 KB.
> I have expanded the memory available to R as follows: R --min-vsize=10M
> --max-vsize=2500M --min-nsize=10M --max-nsize=50M (as specified in the R
> help). After running this command, the memory in R is as follows:
>
>            used (Mb) gc trigger   (Mb) limit (Mb)
> Ncells   513502 13.8   10485760  280.0       1400
> Vcells   142525  1.1  162625696 1240.8       2500
>
> However, I am still getting the error cannot create a vector of 523263 KB.
> Any suggestions/ideas?
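(A rough back-of-the-envelope check makes the scale of the problem clear. The 50 MB per-chip figure below is an assumption for illustration; measure your own chips with object.size().)

```r
## Sketch: extrapolate total memory needed from a per-chip estimate.
## 50 MB per chip is an assumed figure, not measured -- substitute the
## result of object.size() on one of your own loaded chips.
one_chip_bytes <- 50 * 2^20               # assumed ~50 MB per chip
total_gb <- 143 * one_chip_bytes / 2^30   # projected total, in GB
total_gb                                  # ~7 GB -- far beyond a 32-bit process
```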
Well, it basically means that you're running out of memory. The 523263
KB is just the last request that R tried to honour before it failed,
but note that it is half a gigabyte on its own. I forget what the
memory usage per Affy chip is (and it depends on the chip, I suppose),
but how much did the 10 chips take? Do you have room for roughly 14
times that amount?
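(To answer "how much did the 10 chips take?", gc() reports current memory use and object.size() measures a single object. The example below uses a dummy vector as a stand-in for a loaded chip object, just to show the calls.)

```r
## Sketch: measuring memory consumption in R.
x <- numeric(1e6)                            # stand-in for a loaded chip object
size_mb <- as.numeric(object.size(x)) / 2^20 # size of one object, in Mb
size_mb                                      # ~7.6 Mb for a million doubles
gc()                                         # "(Mb)" columns: total Ncells/Vcells in use
```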
It is not unlikely that you're running into the limits of the 32-bit
address space, in which case there is little to do but move to a
64-bit platform (with plenty of swap space).
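(You can check from within R whether you are on a 32-bit build; a pointer size of 4 bytes means a 32-bit address space, typically capping a process at roughly 2-3 GB, while 8 means 64-bit.)

```r
## Sketch: is this a 32-bit or 64-bit build of R?
.Machine$sizeof.pointer                   # 4 = 32-bit, 8 = 64-bit
is_64bit <- .Machine$sizeof.pointer == 8
is_64bit
```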
(Note that you may get replies from more specifically experienced
people on the Bioconductor lists.)
--
O__ ---- Peter Dalgaard Blegdamsvej 3
c/ /'_ --- Dept. of Biostatistics 2200 Cph. N
(*) \(*) -- University of Copenhagen Denmark Ph: (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk) FAX: (+45) 35327907