[R] R memory size increases
Prof Brian Ripley
ripley at stats.ox.ac.uk
Sun Jun 25 15:53:37 CEST 2006
On Sun, 25 Jun 2006, Daniel Gatti wrote:
> O/S : Solaris 9
> R version : 2.2.1
>
> I was getting out of memory errors from R when running a large job, so
> I've switched to a larger machine with 40G shared memory. I issue the
> following command when starting R to increase memory available to R:
>
> R --save --min-vsize=4G --min-nsize=4G
>
> When reading in a file, R responds with "could not allocate vector of
> size 146Kb." So I'm clearly using the command incorrectly. Does anyone
> know how to use this command correctly?
Yes: the correct way is not to use it at all. See ?Memory and ?"Memory-limits". The first says:
     R has a variable-sized workspace (from version 1.2.0).  There is
     now much less need to set memory options than previously, and most
     users will never need to set these.  They are provided both as a
     way to control the overall memory usage (which can also be done by
     operating-system facilities such as 'limit' on Unix), and since
     setting larger values of the minima will make R slightly more
     efficient on large tasks.
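The operating-system route might look like the following (a rough sketch only; the command name and units depend on your shell, and the values here, in Kb, are illustrative):

    limit datasize 8388608    # csh/tcsh: cap the data segment at 8 Gb
    ulimit -d 8388608         # sh/bash equivalent
    R --save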
Do you have a 64-bit version of R? (If not, build one, as you will need
one to make use of large amounts of memory.)
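(A quick check from within R, as a minimal sketch: pointers are 8 bytes on a 64-bit build.)

    .Machine$sizeof.pointer   # 8 on a 64-bit build, 4 on a 32-bit one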
The units for nsize are not bytes but a number of cells (28 bytes on a
32-bit system, 56 on a 64-bit system). You don't need that many cells
(and you would not get them anyway). If (as I suspect) you have a 32-bit
version of R, the limit on vsize will not help either, since address-space
limits will intervene first.
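As a back-of-envelope check of what --min-nsize=4G actually asks for (a sketch of the arithmetic only):

    cells <- 4 * 2^30    # 'G' means 2^30, so about 4.3e9 cons cells
    cells * 28 / 2^30    # ~112 Gb of cell space on a 32-bit build
    cells * 56 / 2^30    # ~224 Gb on a 64-bit build

Either figure is far beyond the 40G of the machine, let alone a 32-bit address space.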
See ?gc and ?gcinfo for ways to track actual memory usage.
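For example (a minimal sketch):

    gcinfo(TRUE)      # report each garbage collection as it happens
    x <- rnorm(1e6)   # allocate ~8 Mb of doubles
    gc()              # current Ncells/Vcells usage and trigger levels
    gcinfo(FALSE)     # turn the reports back off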
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595