[R] Memory/data -last time I promise
Prof Brian D Ripley
ripley at stats.ox.ac.uk
Wed Jul 25 13:57:06 CEST 2001
On Wed, 25 Jul 2001, Agustin Lobo wrote:
> On Tue, 24 Jul 2001, Prof Brian Ripley wrote:
> > The main problem I see is that your machine seems unable to allocate more
> > than about 450Mb to R, and it has surprisingly little swap space. (This
> > 512Mb Linux machine has 1Gb of swap allocated, and happily allocates 800Mb
> > to R when needed.)
> Well, this raises an interesting point for me: is there any advice on
> how to configure a particular system for best R performance with
> large datasets? I've looked into the R system guide and could not find
> anything (that document is a bit obscure for me, I must admit).
> Do you get the 800Mb by starting R with a particular option?
No, just the standard options. Basically, under Unix/Linux:
1) Make sure the ulimit/limit settings are suitable (look up your shell's
documentation).
2) Make sure you have ample swap space configured: disc space is
really cheap, and with the current non-moving-objects garbage collector,
currently unused large objects can be successfully swapped out. (That was
not true before 1.2.0.)
3) Start R without any options.
So there is no advice, as nothing special is needed.
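The checks above can be done from the shell before starting R. A minimal sketch for bash on Linux (the exact ulimit flags and the /proc path vary by shell and OS, so treat these as illustrative):

```shell
# Show the per-process resource limits for this shell (bash syntax).
# For large R workspaces, "data seg size" and "virtual memory" should
# read "unlimited" or a suitably large value.
ulimit -a

# Raise the soft data-segment limit up to the hard limit if it is
# capped (bash built-in; the value is in KB unless "unlimited"):
ulimit -d unlimited

# Check how much swap is configured (Linux; SwapTotal is in KB):
grep -i swap /proc/meminfo
```

If the limits are already unlimited and swap is generous, no further tuning is needed; R started with no options will use what the system allows.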
On *Windows* there is an equivalent of ulimit/limit set, and you are likely
to be less successful in running large R workspaces. In so far as I
understand it, this applies to the classic Mac port too.
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford
1 South Parks Road, Oxford OX1 3TG, UK
Tel: +44 1865 272861 (self), +44 1865 272860 (secr)
Fax: +44 1865 272595