[R] Memory problem ... Again
Peter Dalgaard
p.dalgaard at biostat.ku.dk
Mon Jan 3 23:39:46 CET 2005
Tae-Hoon Chung <thchung at tgen.org> writes:
> Happy new year to all;
>
> A few days ago, I posted a similar problem. At that time, I found out that
> our R program had been compiled as 32-bit, not 64-bit. So R was re-installed
> as a 64-bit build and the same job was run again: reading in 150 Affymetrix
> U133A v2 CEL files and performing dChip processing. However, the memory
> problem happened again. Since the amount of physical memory is 64GB, I think
> it should not be a problem. Is there any way we can configure memory usage
> so that all physical memory can be utilized?
>
> Our system is like this:
> System type: IBM AIX Symmetric Multiprocessing (SMP)
> OS version: SuSE 8 SP3a
> CPU: 8
> Memory: 64GB
.....
> expression values: liwong
> normalizing...Error: cannot allocate vector of size 594075 Kb
> > gc()
>          used (Mb) gc trigger (Mb)
> Ncells 797971 21.4    1710298 45.7
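Before tuning anything, it is worth double-checking that the binary you are
running really is the 64-bit build, and that no nsize/vsize limits were set
at startup. A quick sketch (assuming your R version already has
.Machine$sizeof.pointer and mem.limits(); check your documentation if not):

> .Machine$sizeof.pointer   # 8 on a 64-bit build, 4 on a 32-bit one
[1] 8
> mem.limits()              # NA means no explicit cap was given at startup
nsize vsize 
   NA    NA 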
As Brian Ripley told you, 64-bit builds of R have 56-byte Ncells, so if
yours were one, you should have
> 797971*56/1024/1024
[1] 42.61625
i.e. 42.6 Mb used for your Ncells, and it seems that you don't....
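Conversely, the 28-byte cells of a 32-bit build fit the reported figure
almost exactly:

> 797971*28/1024/1024
[1] 21.30812

which matches the 21.4 Mb in your gc() output, so I suspect you are still
running the old 32-bit binary.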
--
   O__  ---- Peter Dalgaard             Blegdamsvej 3
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N
 (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907