[R] Memory problem ... Again
Tae-Hoon Chung
thchung at tgen.org
Mon Jan 3 22:29:03 CET 2005
Happy new year to all;
A few days ago I posted a similar problem. At that time I found out that our
R installation had been compiled as 32-bit, not 64-bit. R has since been
reinstalled as a 64-bit build and I ran the same job again: reading in 150
Affymetrix U133A v2 CEL files and performing dChip processing. However, the
memory problem happened again. Since we have 64GB of physical memory, I do
not think memory itself should be the problem. Is there any way we can
configure memory usage so that all of the physical memory can be utilized?
Our system is like this:
System type: IBM AIX Symmetric Multiprocessing (SMP)
OS version: SuSE 8 SP3a
CPU: 8
Memory: 64GB
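
For reference, here is a minimal check (base R only; the ulimit call assumes a
Unix shell) that, as far as I understand, should confirm whether the running
build is really 64-bit and whether any limits are in effect:

> .Machine$sizeof.pointer   # 8 on a 64-bit build, 4 on a 32-bit one
> R.version$arch            # architecture R was compiled for
> mem.limits()              # NA NA means no nsize/vsize cap set inside R
> system("ulimit -a")       # the shell's limits can still cap the process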
The code is as follows:
> Data <- ReadAffy(filenames = paste(HOME, "CelData/", fname, sep=""))
> eset <- expresso(Data, normalize.method="invariantset", bg.correct=FALSE,
+                  pmcorrect.method="pmonly", summary.method="liwong")
normalization: invariantset
PM/MM correction : pmonly
expression values: liwong
normalizing...Error: cannot allocate vector of size 594075 Kb
> gc()
           used  (Mb) gc trigger   (Mb)
Ncells   797971  21.4    1710298   45.7
Vcells 76716794 585.4  305954055 2334.3
...
> mem.limits()
nsize vsize
NA NA
> object.size(Data)
[1] 608355664
> memory.profile()
    NILSXP     SYMSXP    LISTSXP     CLOSXP     ENVSXP    PROMSXP    LANGSXP
         1      30484     372383       4845        420        180     127274
SPECIALSXP BUILTINSXP    CHARSXP     LGLSXP                           INTSXP
       203       1168     111430       5296          0          0      44650
   REALSXP    CPLXSXP     STRSXP     DOTSXP     ANYSXP     VECSXP    EXPRSXP
     13382          9      60170          0          0      26003          0
  BCODESXP  EXTPTRSXP WEAKREFSXP
         0        106          0
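
For what it is worth, if I do the arithmetic on the numbers above, the vector
that failed to allocate is almost exactly the size of one more full copy of
the raw CEL data, which should be nowhere near the 64GB of physical memory:

> 594075 * 1024         # size in bytes of the allocation that failed
[1] 608332800
> object.size(Data)     # i.e. roughly one extra copy of the Data object
[1] 608355664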