[R] Absolute ceiling on R's memory usage = 4 gigabytes?

Paul Gilbert pgilbert at bank-banque-canada.ca
Fri Jul 2 16:39:37 CEST 2004


It looks like you have R compiled as a 32-bit application; you will need 
to compile it as a 64-bit application if you want to address more than 
4 GB of memory. I am not familiar with the SGI IRIX machine, but you can 
do this on many workstations that have 64-bit processors and an OS that 
supports them. The R-admin manual has some hints about how to do this on 
various platforms.
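For illustration, a 64-bit build usually comes down to passing the right ABI flags at configure time. The flags below are a sketch for IRIX with the MIPSpro compilers and are assumptions on my part; check the R-admin manual for the settings that apply to your compiler and OS.

```shell
# Hypothetical 64-bit build of R on IRIX with the MIPSpro compilers;
# the -64 ABI flag is illustrative -- consult the R-admin manual for
# the options appropriate to your platform.
CC="cc -64" F77="f77 -64" ./configure
make

# Afterwards, a 64-bit build should report 8-byte pointers:
echo 'cat(.Machine$sizeof.pointer, "\n")' | ./bin/R --slave
```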

Paul Gilbert

Kort, Eric wrote:

>Yes, we are using the HGU-133plus2 chips with 50,000+ probes, and I suppose that the memory requirements increase geometrically as the chip size increases.
> 
>Thanks for your email...I can let you know if we have any success if you are interested for future reference.
> 
>-Eric
>
>	-----Original Message----- 
>	From: Tae-Hoon Chung [mailto:thchung at tgen.org] 
>	Sent: Thu 7/1/2004 7:52 PM 
>	To: Kort, Eric 
>	Cc: r-help at stat.math.ethz.ch 
>	Subject: Re: [R] Absolute ceiling on R's memory usage = 4 gigabytes?
>	
>	
>
>	Hi, Eric.
>	This seems a little puzzling to me. Which Affymetrix chip do you use?
>	The reason I ask is that yesterday I was able to normalize 150
>	HU-133A CEL files (containing 22283 probes) with R 1.9.1 on Mac OS
>	X 10.3.3 with 1.5 GB of memory. If your chip has more probes than
>	that, then the higher memory use would be understandable ...
>	
>	On Jul 1, 2004, at 2:59 PM, Kort, Eric wrote:
>	
>	> Hello.  By way of background, I am running out of memory when
>	> attempting to normalize the data from 160 affymetrix microarrays using
>	> justRMA (from the affy package).  This is despite making 6 gigabytes
>	> of swap space available on our sgi irix machine (which has 2 gigabytes
>	> of ram).  I have seen in various discussions statements such as "you
>	> will need at least 6 gigabytes of memory to normalize that many
>	> chips", but my question is this:
>	>
>	> I cannot set the memory limits of R (1.9.1) higher than 4 gigabytes as
>	> attempting to do so results in this message:
>	>
>	> WARNING: --max-vsize=4098M=4098`M': too large and ignored
>	>
>	> I experience this both on my Windows box (on which I cannot allocate
>	> more than 4 gigabytes of swap space anyway) and on the above-mentioned
>	> sgi irix machine (on which I can).  In view of that, I do
>	> not see what good it does to make > 4 gigabytes of ram+swap space
>	> available.  Does this mean 4 gigabytes is the absolute upper limit of
>	> R's memory usage...or perhaps 8 gigabytes since you can set both the
>	> stack and the heap size to 4 gigabytes?
>	>
>	> Thanks,
>	> Eric
>	>
>	>
>	> This email message, including any attachments, is for the
>	> so...{{dropped}}
>	>
>	> ______________________________________________
>	> R-help at stat.math.ethz.ch mailing list
>	> https://www.stat.math.ethz.ch/mailman/listinfo/r-help
>	> PLEASE do read the posting guide!
>	> http://www.R-project.org/posting-guide.html
>	>
>	>
>	Tae-Hoon Chung, Ph.D
>	
>	Post-doctoral Research Fellow
>	Molecular Diagnostics and Target Validation Division
>	Translational Genomics Research Institute
>	1275 W Washington St, Tempe AZ 85281 USA
>	Phone: 602-343-8724
>	
>	
>
>
>

