[R] About Memory size

Uwe Ligges ligges at statistik.uni-dortmund.de
Sat Jun 23 17:00:22 CEST 2007



Ferdouse Begum wrote:
> Hi,
> I am trying to analyse cancer data set (affymetrix) by
> using bioconductor packages. I have total 14 data set.
> Total size of the data set is 432MB. 

Do you mean it consumes 432MB once the data are loaded in R, or is that 
its size in some format on the hard disk?
Do you need to work on all of the datasets at once?
Have you read the manuals and FAQs (there are sections about memory!)?
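A quick base-R sketch of how to answer the first question yourself, i.e. how much memory the data actually occupy *inside* R as opposed to on disk. The matrix `x` below is a stand-in for a large dataset; substitute your own object names:

```r
# Stand-in for a large dataset (8 million bytes of doubles).
x <- matrix(rnorm(1e6), ncol = 100)

# Size of one object in memory:
print(object.size(x), units = "Mb")

# Current total allocation, after a garbage collection:
gc()

# Sizes of everything in the workspace, largest first:
sizes <- sapply(ls(), function(nm) object.size(get(nm)))
sort(sizes, decreasing = TRUE)
```

Comparing `object.size()` of the loaded objects against the 432MB on disk tells you which figure the discussion should be about.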


> Now I am trying
> to analyse these data sets in my PC with RAM 512. But


If you need 432MB just to have the data available in R, then you should 
have *at least* 1GB of RAM in your machine, and for certain functions 
you might need much more.

Hence the advice is to rethink how to reduce the problem, or to buy 2GB 
of RAM for your machine (which is advisable in any case, because RAM is 
cheap and thinking hurts). We have upgraded all of our computer labs to 
at least 1GB these days.
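One common way to "reduce the problem" with 14 datasets is to process them one at a time, keeping only the (small) per-dataset result and releasing each large object before loading the next, so peak memory is one dataset rather than all of them. A hypothetical sketch with placeholder files (the file names and the `mean()` step are illustrative, not from the original post):

```r
# Create tiny stand-in .rds files so the sketch runs end-to-end;
# in practice these would be the 14 large datasets on disk.
files <- replicate(3, tempfile(fileext = ".rds"))
for (f in files) saveRDS(rnorm(10), f)

results <- vector("list", length(files))
for (i in seq_along(files)) {
  d <- readRDS(files[i])       # load only one dataset at a time
  results[[i]] <- mean(d)      # keep only the small summary result
  rm(d)                        # drop the big object ...
  gc()                         # ... and return its memory to the pool
}
```

Whether this applies depends on the analysis: per-array diagnostics such as MA plots can be done one array at a time, whereas normalization across all arrays cannot.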

Uwe Ligges


> if I want to get an MAplot of my data set, I am getting
> the message:
>> MAplot(ptc.rawData)
> Error: cannot allocate vector of size 73.8 Mb
> In addition: Warning messages:
> 1: Reached total allocation of 503Mb: see
> help(memory.size) 
> 2: Reached total allocation of 503Mb: see
> help(memory.size) 
> 3: Reached total allocation of 503Mb: see
> help(memory.size) 
> 4: Reached total allocation of 503Mb: see
> help(memory.size)
> 
> Now how can I get rid of this problem?
> Please help.
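For reference, the 503Mb cap in the warnings is the per-session allocation limit on the 32-bit Windows builds of R from that era. It could be inspected and raised with `memory.size()`/`memory.limit()` (Windows-only functions, and defunct since R 4.2.0), so the sketch below guards the calls accordingly:

```r
# memory.size()/memory.limit() existed only on Windows builds of R
# and were made defunct in R >= 4.2.0; guard the calls so this is
# safe to run anywhere.
if (.Platform$OS.type == "windows" && getRversion() < "4.2.0") {
  memory.size()              # MB currently allocated by R
  memory.limit()             # current cap in MB (503 in the post)
  memory.limit(size = 1024)  # try to raise the cap to 1GB
}
```

Raising the limit only helps up to the physical RAM actually installed, which is why the advice above is to add RAM or restructure the analysis.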

> With thanks
> 
> Ferdouse
> 
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
