[R] R and large RAM

Joerg Kindermann Joerg.Kindermann at gmd.de
Fri Jul 7 16:07:25 CEST 2000


>>>>> " " == Dipl -Stat Detlef Steuer <steuer at statistik.uni-dortmund.de> writes:

     > Hi!  On 07-Jul-2000 Gerhard Paass wrote:
     >> I am planning to buy several machines with 2 GB RAM.  Is R able to
     >> use this much memory?

     > Yes! We are happily running R on two Enterprise Servers with 2 GB
     > RAM.  R behaved well when asked to consume a lot of this RAM to
     > work on large datasets.

This does not mean that R can actually use that much space. From experimenting
with the nsize and vsize command line parameters we know that the current
(R-1.1.0) limits are at about nsize=19.5M and vsize=990M. On a Sun running
Solaris 2.7 this requires about 1400M of memory.
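For reference, an invocation at those limits would look roughly like the
following (a sketch only; whether the k/M suffixes are accepted and how much
real memory the process then needs depends on the R version and platform):

    # start R with an enlarged fixed workspace (R 1.1.x options):
    # --nsize sets the number of cons cells, --vsize the vector heap size
    R --nsize=19500000 --vsize=990M

On a 32-bit build each cons cell occupies on the order of 20 bytes, so 19.5M
cells (about 390M) plus the 990M vector heap would roughly account for the
1400M figure above.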

Larger sizes cannot be requested, for R-internal reasons. Our question is
whether this is likely to change in the near future.

best regards

Joerg Kindermann

--  Dr. Joerg Kindermann                        Knowledge Discovery Team
    GMD - German National Research Center for Information Technology
    phone: +49 02241 142437   fax: +49 02241 142342
    http://ais.gmd.de/KD/



