[R] vsize and nsize

Tony Long tdlong at uci.edu
Sat May 15 18:01:41 CEST 1999


Brian (and group):

	Whoops.  I wrote the message meaning to look up the version number
and fill it in, but then somehow forgot.  I am running version 0.63.2 (and
with the command-line settings below was not getting an error message) --
so an upgrade is in order.  The other advice (regarding compile-time
settings) is also very useful.  Much appreciated.

	I agree that R is not "designed" for large calculations.  On the
other hand, it is nice to have one statistical package to use for all
calculations.  I mostly deal with Drosophila and DNA; as such, I am an
amateur statistician and would like to avoid learning a number of
statistical languages.  With a big Linux box, I can often power through
things.  In the past I have found it frustrating to do a bunch of work in
SAS only to hit a snag and then have to write time-consuming C code to
finish the job.  So although R was not designed for large calculations, it
is so flexible and logical that it is very attractive for such work, and I
think many other people may be similarly attracted to the language.  I
would welcome further dialogue, as I suspect that much more can be
accomplished in R than its founders intended.



>On Fri, 14 May 1999, Tony Long wrote:
>
>> I am running R version ??? under Redhat 5.2.  It seems as though the
>> --nsize option has no effect on the size of the allocated Ncells as
>> determined using gc().  Yes, I have that much data....
>>
>> That is, if I invoke R with
>>
>> R --vsize 100 --nsize 5000000
>>
>> then type
>>
>> gc()
>>
>> I get
>>
>> 	free		total
>> Ncells	92202		200000
>> Vcells	12928414	13107200
>
>Well, we do need to know what ??? is (is it so hard? It appears on the
>start-up banner, or use Version()).  There are limits to both nsize and
>vsize, but they are currently 20000000 and 2048M.  If you try to specify
>more than the limit, you will get a warning about the limit being ignored.
>(Have you overlooked that as well as the start-up banner?) Given that a
>recent version of R will object to --vsize 100 (it should be 100M), I
>surmise yours is not 0.64.1 and suggest you upgrade.
>
>The limits are OS-specific compile-time settings: look in src/unix/system.c
>to change them for Linux. The real limit on vsize is LONG_MAX, slightly
>greater on a 32-bit machine, and I guess the limit on nsize is chosen to be
>comparable (it is harder to need lots of ncells).  However, as Ross has
>said here, R is not designed for very large computations so the current
>limits may be rather academic.
>
>--
>Brian D. Ripley,                  ripley at stats.ox.ac.uk
>Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
>University of Oxford,             Tel:  +44 1865 272861 (self)
>1 South Parks Road,                     +44 1865 272860 (secr)
>Oxford OX1 3TG, UK                Fax:  +44 1865 272595
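
Following Brian's advice, here is what I plan to try after upgrading to
0.64.1.  This is a sketch only: the unit suffix on --vsize is the fix
Brian describes, and the nsize value is simply the one from my original
message.

	R --vsize 100M --nsize 5000000

Then, at the R prompt, check that the request took effect:

	gc()

If all is well, the Ncells total should now read 5000000 rather than the
200000 shown above, and a request beyond the compiled-in limits (20000000
ncells or 2048M of vsize) should draw a warning rather than being silently
ignored.  If those stock limits ever bind, the place to raise them is
src/unix/system.c, as Brian notes.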


Tony Long
Ecology and Evolutionary Biology
Steinhaus Hall
University of California at Irvine
Irvine, CA
92697-2525

Tel:  (949) 824-2562  (office)    ****NOTE NEW AREA CODE****
Tel:  (949) 824-5994  (lab)       ****NOTE NEW AREA CODE****
Fax:  (949) 824-2181              ****NOTE NEW AREA CODE****

email:  tdlong at uci.edu 

