[R] Memory problem ... Again => False Alarm !!!
Tae-Hoon Chung
thchung at tgen.org
Mon Jan 3 23:46:15 CET 2005
Thanks, Peter and Andy;
I just found out that it was not a memory problem after all; it was a false alarm.
The 64-bit compiled program works fine!
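(For anyone who wants to double-check which build is actually running, here is a
minimal sketch; .Machine$sizeof.pointer is a standard base-R field, 4 on a
32-bit build and 8 on a 64-bit one:)

    ## a minimal sketch: confirm whether the running R is a 64-bit build;
    ## pointers are 8 bytes on 64-bit builds and 4 bytes on 32-bit ones
    .Machine$sizeof.pointer       # 8 on a 64-bit build, 4 on 32-bit
    8 * .Machine$sizeof.pointer   # word size in bits: 64 or 32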
On 1/3/05 3:39 PM, "Peter Dalgaard" <p.dalgaard at biostat.ku.dk> wrote:
> Tae-Hoon Chung <thchung at tgen.org> writes:
>
>> Happy new year to all;
>>
>> A few days ago, I posted a similar problem. At that time, I found out that
>> our R installation had been compiled as 32-bit, not 64-bit. So R was
>> re-installed as a 64-bit build and I ran the same job again, reading in 150
>> Affymetrix U133A v2 CEL files and performing dChip processing. However, the
>> memory problem happened again. Since the amount of physical memory is 64GB,
>> I think it should not be a problem. Is there any way we can configure memory
>> usage so that all of the physical memory can be utilized?
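(An aside on the configuration question: R's heap limits can be set when R is
started; a minimal sketch follows, with illustrative values only -- see ?Memory
for the authoritative list of options:)

    ## a minimal sketch: start R with explicit heap limits (values are
    ## illustrative); --max-vsize caps the vector heap in bytes and
    ## --max-nsize caps the number of cons cells (Ncells)
    R --max-vsize=49152M --max-nsize=1000000000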
>>
>> Our system is like this:
>> System type: IBM AIX Symmetric Multiprocessing (SMP)
>> OS version: SuSE 8 SP3a
>> CPU: 8
>> Memory: 64GB
> .....
>> expression values: liwong
>> normalizing...Error: cannot allocate vector of size 594075 Kb
>>> gc()
>>          used (Mb) gc trigger (Mb)
>> Ncells 797971 21.4    1710298 45.7
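(Worth spelling out: the failed allocation back-computes to exactly one full
double-precision intensity matrix for the 150 arrays, assuming one 8-byte value
per CEL cell, so a genuine 64-bit build with 64GB should handle it easily:)

    ## a minimal sketch: back-compute the failed 594075 Kb allocation
    594075 * 1024 / 8        # 76041600 doubles in total
    594075 * 1024 / 8 / 150  # 506944 = 712^2 values per array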
>
> As Brian Ripley told you, 64-bit builds of R have 56-byte Ncells, so if
> yours was one, you should have
>
>> 797971*56/1024/1024
> [1] 42.61625
>
> i.e. 42.6 Mb used for your Ncells, and it seems that you don't...
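(Peter's arithmetic, spelled out: Ncells are 28 bytes on 32-bit builds and 56
bytes on 64-bit builds, so the reported 21.4 Mb points to a 32-bit build:)

    ## a minimal sketch of the check implied above
    797971 * 28 / 1024^2   # 21.3 Mb -- matches the reported 21.4 (32-bit)
    797971 * 56 / 1024^2   # 42.6 Mb -- what a 64-bit build would report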