[R] R and memory
Prof Brian Ripley
ripley at stats.ox.ac.uk
Wed Jan 10 18:30:01 CET 2001
On Wed, 10 Jan 2001, Meriema Belaidouni wrote:
> I have some problems reading a large data file with R.
We need to know more to be able to help with that.
> can someone tell me why running
> R --visze=30M --nsize=2000k
> uses in fact 63M?
Let me try. That allocates (modulo a typo) 30Mb of heap and 2 million
cons cells. I will assume you are using a 32-bit system and a version of
R prior to 1.2.0. A cons cell is then 20 bytes, so those 2 million cells
account for 38.2Mb. So (and I tried it) the total workspace is 68.2Mb.
However, depending on your system, not all of that may appear under, say,
top, and in any case the R process needs another 4Mb or so for its code.
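The arithmetic can be checked directly. A back-of-envelope sketch (in Python, purely for the arithmetic; the 20-byte cons cell is the figure quoted above for 32-bit R before 1.2.0, and 1Mb is taken as 2^20 bytes):

```python
# Reproduce the workspace arithmetic from the message above.
# Assumes a 32-bit build of R older than 1.2.0, where one cons
# cell occupies 20 bytes and 1Mb = 2**20 bytes.
vsize_bytes = 30 * 2**20      # --vsize=30M: 30 Mb of vector heap
ncells = 2_000_000            # --nsize=2000k: 2 million cons cells
cons_bytes = ncells * 20      # 40,000,000 bytes, roughly 38.2 Mb
total_mb = (vsize_bytes + cons_bytes) / 2**20
print(f"{total_mb:.1f} Mb")   # close to the 68.2 Mb quoted above
```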
The current version of R may well behave differently, and we do suggest
that you do not use those flags any more.
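From 1.2.0 the heap grows on demand, so fixed sizes are rarely needed; if you do want a ceiling, the later interface uses maximum limits instead. The flag names below are taken from the ?Memory help page of later R versions, so verify them against your own installation before relying on them:

```shell
# Set upper limits rather than fixed allocations (R >= 1.2.0; flag
# names assumed from later ?Memory documentation -- verify locally).
R --max-vsize=30M --max-nsize=2000k
```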
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford
1 South Parks Road, Oxford OX1 3TG, UK
Tel: +44 1865 272861 (self), +44 1865 272860 (secr)
Fax: +44 1865 272595
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject!) to: r-help-request at stat.math.ethz.ch