[R] R vs S-PLUS with regard to memory usage

Anantha Prasad/NE/USDAFS aprasad at fs.fed.us
Mon Oct 2 22:59:11 CEST 2000

OK ... my mistake ... this vsize and nsize confusion. I re-read the
memory documentation and Prof. Ripley's reply - here is the lesson I learned:
increase vsize but NOT nsize and you won't have problems with disk
swapping. Yes, I now have renewed faith that I can indeed use R - although
I do feel 1.2 would be better if this "allocation confusion" is taken care of.
Thanks again.

Mr. Anantha Prasad, Ecologist/GIS Specialist
USDA Forest Service, 359 Main Rd.
Delaware OHIO 43015    USA
Ph: 740-368-0103  Email: aprasad at fs.fed.us
Web: http://www.fs.fed.us/ne/delaware/index.html
Don't Miss Climate Change Tree Atlas at:

Peter Dalgaard, BSA <p.dalgaard at biostat.ku.dk> (sent by: pd at blueberry.kub)
wrote on 10/02/00 04:20:
    To: "Anantha Prasad/NE/USDAFS" <aprasad at fs.fed.us>
    cc: r-help at stat.math.ethz.ch
    Subject: Re: [R] R vs S-PLUS with regard to memory usage

"Anantha Prasad/NE/USDAFS" <aprasad at fs.fed.us> writes:

> I am trying to translate code from S-PLUS to R and R really struggles!
> After starting R with the following:
> R --vsize 50M --nsize 6M --no-restore
> on a 400 MHz Pentium with 192 MB of memory running Linux (RH 6.2),
> I run a function that essentially picks up an external dataset with 2121
> rows
> and 30 columns and builds a lm() object and also runs step() ... the step
> takes forever to run...(takes very little time in S-PLUS).

Notice that the --nsize takes the number of *nodes* as the value. Each
node is 20 bytes, so 6M nodes come to 120MB; together with the 50MB
vsize you're allocating a 170MB chunk there. With various other memory
eaters active, that could easily push a 192MB machine into thrashing.
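The arithmetic behind that 170MB figure can be sketched as follows (assuming, as in R 1.x on 32-bit machines, 20 bytes per cons cell, and that the "M" suffix means 2^20):

```python
# Back-of-the-envelope check of the allocation described above.
NODE_BYTES = 20           # assumed size of one R cons cell (R 1.x, 32-bit)
nsize_nodes = 6 * 2**20   # --nsize 6M: six million-ish *nodes*, not bytes
vsize_mb = 50             # --vsize 50M: vector heap, already in MB

node_heap_mb = nsize_nodes * NODE_BYTES // 2**20
total_mb = node_heap_mb + vsize_mb

print(node_heap_mb)  # 120 -> the node heap alone
print(total_mb)      # 170 -> node heap plus vector heap
```

On a 192MB machine this leaves little room for the OS and other processes, which is why swapping sets in.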

The upcoming 1.2 version will be much better at handling memory, but
for now maybe reduce the nsize a bit? The vsize looks a bit hefty as
well given that the data should take up on the order of half a MB.
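The "half a MB" estimate for the data checks out if one assumes the 2121 x 30 table is stored as 8-byte doubles:

```python
# Rough in-memory size of the dataset mentioned in the quoted message,
# assuming all 30 columns are numeric (8-byte doubles).
rows, cols, double_bytes = 2121, 30, 8
size_bytes = rows * cols * double_bytes
size_mb = size_bytes / 2**20
print(f"{size_mb:.2f} MB")  # roughly half a megabyte
```

So even with generous working copies during lm() and step(), a vsize far below 50MB should suffice for data of this size.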

   O__  ---- Peter Dalgaard             Blegdamsvej 3
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N
 (*) \(*) -- University of Copenhagen   Denmark      Ph: (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907

r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
