[Rd] Q: R 2.2.1: Memory Management Issues?
Prof Brian Ripley
ripley at stats.ox.ac.uk
Fri Jan 6 09:44:28 CET 2006
On Thu, 5 Jan 2006, Simon Urbanek wrote:
> Karen,
>
> On Jan 5, 2006, at 5:18 PM, <Karen.Green at sanofi-aventis.com> wrote:
>
>> I am trying to run an R script which makes use of the MCLUST package.
>> The script can successfully read in the approximately 17000 data
>> points, but then throws an error:
>> --------------------------------------------------------
>> Error: cannot allocate vector of size 1115070Kb
>
> This is 1.1GB of RAM to allocate for a single vector(!). As you
> stated yourself, the total upper limit is 2GB, so you cannot even
> fit two of those in memory - there is not much you can do with the
> vector even if it is allocated.
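For scale, the arithmetic behind that figure (a quick sketch in R; the
17000 x 17000 matrix at the end is only a point of comparison, not
necessarily what mclust tried to allocate):

    bytes <- 1115070 * 1024    # the reported allocation, in bytes (~1.06Gb)
    bytes / 8                  # ~143 million doubles at 8 bytes each
    n <- 17000
    8 * n^2 / 2^30             # a full n x n double matrix would be ~2.2Gb,
                               # beyond a 2-3Gb address space altogether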
Just in case people missed this (Simon, as a MacOS user, has no reason to
know it): the Windows limit is in fact 3Gb if you tell your OS to allow
it. (How to do so is explained in the quoted rw-FAQ, Q2.9; from 2.2.1 R
notices this automatically, whereas earlier versions needed to be told.)
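From within R on Windows you can check this, and (up to whatever the OS
grants) raise it; a minimal sketch using the Windows-only memory.limit()
and memory.size() functions:

    memory.limit()              # current limit, in Mb
    memory.limit(size = 3000)   # ask for ~3Gb; honoured only if the OS allows
    memory.size(max = TRUE)     # maximum Mb obtained from the OS so far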
However, there is another problem on a 32-bit OS: two 1.1Gb objects fit
into a 3Gb address space only if they land in specific positions, and
fragmentation of the address space is often a big problem.
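A hypothetical illustration of the fragmentation point, with sizes chosen
to match the failing allocation (on a 32-bit system do not expect the
second line to succeed):

    x <- numeric(143e6)    # ~1.1Gb of doubles: needs one contiguous block
    y <- numeric(143e6)    # a second contiguous 1.1Gb block: this often
                           # fails with "cannot allocate vector of size ..."
                           # even though the memory limit has not been hit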
I believe a 64-bit OS with 4Gb of RAM would handle such problems much
more comfortably. The alternative is to find (or write) more efficient
mixture-fitting software than mclust.
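(Fitting on a random subsample is one common workaround short of new
software; a minimal sketch, assuming the Mclust() interface, with 'dat'
and the subsample size of 2000 as placeholders:)

    library(mclust)
    idx <- sample(nrow(dat), 2000)   # 'dat': the full ~17000-point data set
    fit <- Mclust(dat[idx, ])        # fitting/model selection on the
                                     # subsample keeps the internal
                                     # matrices manageable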
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595