[R] Affymetrix data analysis

Sicotte, Hugues Ph.D. Sicotte.Hugues at mayo.edu
Fri Feb 2 14:20:46 CET 2007


I stand corrected; the rule was not 1/2. (I have 2 GB, so the rule
applied to my office's PCs.)
Still, R doesn't always use all available memory on Windows, and one
may be able to set the options in order to get more.
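For example (a minimal sketch; memory.size() and memory.limit() are
Windows-only functions, and the 3000 below is only an illustrative
value, capped by your OS, R build, and installed RAM):

    memory.size()              # Mb currently in use by R
    memory.size(max = TRUE)    # Mb obtained from the OS so far
    memory.limit()             # current allocation limit, in Mb
    memory.limit(size = 3000)  # try to raise the limit to ~3 Gb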

Hugues
P.S. By the way, Prof. Ripley, thanks for all your efforts on R.

-----Original Message-----
From: Prof Brian Ripley [mailto:ripley at stats.ox.ac.uk] 
Sent: Friday, February 02, 2007 5:37 AM
To: Sicotte, Hugues Ph.D.
Cc: Tristan Coram; R-help at stat.math.ethz.ch
Subject: RE: [R] Affymetrix data analysis

On Fri, 2 Feb 2007, Sicotte, Hugues Ph.D. wrote:

> Of course, you would know best, so can you tell us if the help page I
> pull up using
>
> help(Memory)
>
> is wrong?
> That help page says (2nd paragraph)
>
> "(On Windows the --max-mem-size option sets the maximum memory
> allocation: it has a minimum allowed value of 16M. This is intended to
> catch attempts to allocate excessive amounts of memory which may cause
> other processes to run out of resources. The default is the smaller of
> the amount of physical RAM in the machine and 1024Mb. See also
> memory.limit.) "

It says nothing about 'half', does it?

Depending on your version of R and Windows, the default is 1Gb, 1.5Gb
or 2.5Gb, and the rw-FAQ gives the whole truth. The current version of
that help page is different:

https://svn.r-project.org/R/trunk/src/library/base/man/Memory.Rd

It looks like in 2.4.1 it had not been updated yet.
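(For readers finding this in the archive: the option named in the
quoted help text is given on the command line when starting R on
Windows; the 2047M value below is only an example, and the real ceiling
depends on your Windows version and R build:)

    Rgui.exe  --max-mem-size=2047M    # GUI console
    Rterm.exe --max-mem-size=2047M    # command-line R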

>
>
> Hugues
>
> -----Original Message-----
> From: Prof Brian Ripley [mailto:ripley at stats.ox.ac.uk]
> Sent: Friday, February 02, 2007 3:05 AM
> To: Sicotte, Hugues Ph.D.
> Cc: Tristan Coram; R-help at stat.math.ethz.ch
> Subject: Re: [R] Affymetrix data analysis
>
> On Thu, 1 Feb 2007, Sicotte, Hugues Ph.D. wrote:
>
>> Tristan,
>> I have a soft spot for problems analyzing microarrays with R.
>>
>> For the memory issue, there have been previous posts to this list,
>> but here is the answer I gave a few weeks ago.
>> If you need more memory, you have to move to Linux or recompile R
>> for Windows yourself,
>> but you'll still need a computer with more memory.
>> The long-term solution, which we are implementing, is to rewrite the
>> normalization code so it doesn't
>> need to load all those arrays at once.
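[A minimal sketch of that streaming idea, not the actual Mayo code;
the file layout is hypothetical, with each text file in cel_txt/
holding one array's probe intensities as a numeric column:]

    files <- list.files("cel_txt", pattern = "\\.txt$", full.names = TRUE)
    running_sum <- NULL
    n <- 0
    for (f in files) {
        x <- scan(f, quiet = TRUE)      # load a single array's intensities
        if (is.null(running_sum)) running_sum <- numeric(length(x))
        running_sum <- running_sum + x  # accumulate, then discard
        n <- n + 1
        rm(x)                           # only one array in memory at a time
    }
    mean_intensity <- running_sum / n   # per-probe reference for a
                                        # scaling-style normalization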
>>
>> -- cut previous part of message --
>> The default in R is to play nice and limit your allocation to half
>> the available RAM. Make sure you have a lot of disk swap space (at
>> least 1G with 2G of RAM) and you can set your memory limit to 2G for R.
>
> That just isn't true (R uses as much of the RAM as is reasonable: all
> of it, for up to 1.5Gb installed).  Please consult the rw-FAQ for the
> whole truth.
>
>> See help(memory.size) and use the memory.limit function.
>
> [Please follow the advice you quote.]
>
>> Hugues
>>
>>
>> P.S. Someone let me use their Linux machine with 16 GB of RAM,
>> and I was able to run 64-bit R with "top" showing 6 GB of RAM
>> allocated (with suitable --max-mem-size command-line parameters at
>> startup for R).
>
> There is no such 'command' for R under Linux.
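[A note for the archive: --max-mem-size is Windows-only. On Unix-alikes
the equivalent knobs are different; a hedged sketch, assuming a
bash-like shell, with option names and accepted suffixes varying by
R version (see help('Memory')):]

    ulimit -v 6291456    # shell-level cap on virtual memory, in kB (~6 Gb)
    R --max-vsize=6G     # or limit R's vector heap directly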
>
>
>

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595


