[BioC] Memory limit (vector size) on linux 64bit
Ivan Porro
pivan at dist.unige.it
Fri Feb 9 12:56:06 CET 2007
Hi all,
I'm running a script that tries to normalise 448 HGU133A Affymetrix arrays, and I
get "The Error" during ReadAffy():
Error: cannot allocate vector of size 1770343 Kb
I know about R and OS addressing limitations, so (following several posts on
the mailing list) I'm doing this on a 64-bit server:
x86_64 GNU/Linux (2x AMD Opteron 275)
R 2.3.1 compiled from source
MemTotal: 7819808 kB
VmallocTotal: 34359738367 kB
that is, 8 GB of RAM (the difference is probably reserved by the on-board video) and a
VmallocTotal of about 34 TB (kernel virtual address space, not actual swap).
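For what it's worth, a quick check from inside the R session (the /proc/meminfo
line is Linux-specific) should confirm the build is really 64-bit:

.Machine$sizeof.pointer      # 8 on a genuine 64-bit build of R
gc()                         # memory currently used by this session, in Mb
grep("^MemTotal", readLines("/proc/meminfo"), value = TRUE)   # RAM seen by the kernel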
I know from the R FAQ that "There are also limits on individual objects. On all
versions of R, the maximum length (number of elements) of a vector is 2^31 - 1
~ 2*10^9"
But 2^31 = 2,147,483,648, which is bigger than 1,770,343,000 bytes (my vector size).
Am I near (or above) R's physical limitations?
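To make that comparison explicit (the FAQ limit counts elements, while the error
reports Kb, so I convert assuming the vector holds 8-byte doubles and that R's
Kb means 1024 bytes):

alloc_bytes <- 1770343 * 1024   # size reported by the error, in bytes
elems       <- alloc_bytes / 8  # implied number of double elements, about 2.3e8
elems < 2^31 - 1                # TRUE: well below the per-object length limit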
I use batch <- ReadAffy() and then plan to normalize it with gcrma() (invariant-set
normalization), roughly as in the sketch below.
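Minimal sketch of that step, assuming all 448 CEL files sit in the working
directory (the invariant-set option is left out here):

library(affy)
library(gcrma)

batch <- ReadAffy()      # read every *.CEL file in the working directory into an AffyBatch
eset  <- gcrma(batch)    # GCRMA background correction, normalization and summarization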
Thank you in advance,
Ivan
--
http://www.bio.dist.unige.it
voice: +39 010 3532789
fax: +39 010 3532948