[R] Absolute ceiling on R's memory usage = 4 gigabytes?

Kort, Eric Eric.Kort at vai.org
Fri Jul 2 00:14:49 CEST 2004



>From: Liaw, Andy [mailto:andy_liaw at merck.com]
>
>Did you compile R as a 64-bit executable on the Irix machine?  If not,
>R will be subject to the 4GB limit of 32-bit processes.
>

No...

>Search the archive for `Opteron' and you'll see that for 64-bit
>executables the limit is not 4GB.
>
>Andy

Excellent.  I will recompile and try again.
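
For the record, it is easy to confirm from within R itself whether
the rebuilt executable is really 64-bit, and to see what limits are
currently in force.  A minimal sketch (assuming R 1.9.x; see
?.Machine and ?Memory):

  ## a 64-bit build has 8-byte pointers, a 32-bit build 4-byte ones
  .Machine$sizeof.pointer

  ## ceilings set via --max-nsize/--max-vsize
  ## (NA means no limit was imposed at startup)
  mem.limits()

  ## current allocation and garbage-collection trigger points
  gc()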

Thanks,
Eric

>> From:  Kort, Eric
>> 
>> Hello.  By way of background, I am running out of memory when
>> attempting to normalize the data from 160 Affymetrix
>> microarrays using justRMA (from the affy package).  This is
>> despite making 6 gigabytes of swap space available on our SGI
>> Irix machine (which has 2 gigabytes of RAM).  I have seen in
>> various discussions statements such as "you will need at
>> least 6 gigabytes of memory to normalize that many chips",
>> but my question is this:
>> 
>> I cannot set the memory limits of R (1.9.1) higher than 4
>> gigabytes; attempting to do so results in this message:
>> 
>> WARNING: --max-vsize=4098M=4098`M': too large and ignored
>> 
>> I experience this both on my Windows box (on which I cannot
>> allocate more than 4 gigabytes of swap space anyway) and on
>> the above-mentioned SGI Irix machine (on which I can).  In
>> view of that, I do not see what good it does to make more
>> than 4 gigabytes of RAM+swap space available.  Does this
>> mean 4 gigabytes is the absolute upper limit of R's memory
>> usage... or perhaps 8 gigabytes, since you can set both the
>> cons-cell (--max-nsize) and vector-heap (--max-vsize) limits
>> to 4 gigabytes each?
>> 
>> Thanks,
>> Eric
>> 
>> 
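
On the justRMA question quoted above: justRMA() is designed to use
considerably less memory than ReadAffy() followed by rma(), but with
160 chips it is still worth watching R's footprint around the call.
A rough sketch, assuming the CEL files sit in a single directory
(the path below is a placeholder):

  library(affy)

  gc()    # baseline usage before reading anything

  ## read and RMA-normalize every CEL file in the directory in one
  ## pass; celfile.path defaults to getwd() if omitted
  eset <- justRMA(celfile.path = "/path/to/celfiles")

  gc()    # usage after normalization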