[R] R vs S-PLUS with regard to memory usage

Anantha Prasad/NE/USDAFS aprasad at fs.fed.us
Mon Oct 2 22:42:23 CEST 2000

Worked like magic, thanks. (I thought the more I could allocate the better, since my application does not know beforehand what the data size is going to be.) However, the step() function ran out of heap memory even after I allocated 15 MB, so it looks like a no-win situation: if I increase the heap enough to satisfy step(), there is disk swapping (if not on my present machine, then on others that do not have the luxury of 192 MB); if I don't, it runs out of memory. Any suggestions other than waiting for 1.2? If there are more people like me, that's some incentive to get 1.2 out soon!
Thanks much.

Mr. Anantha Prasad, Ecologist/GIS Specialist
USDA Forest Service, 359 Main Rd.
Delaware OHIO 43015    USA
Ph: 740-368-0103  Email: aprasad at fs.fed.us
Web: http://www.fs.fed.us/ne/delaware/index.html
Don't Miss Climate Change Tree Atlas at:

From:     Peter Dalgaard BSA <p.dalgaard at biostat.ku.dk>
Sent by:  pd at blueberry.kub
To:       "Anantha Prasad/NE/USDAFS" <aprasad at fs.fed.us>
cc:       r-help at stat.math.ethz.ch
Date:     10/02/00 04:20
Subject:  Re: [R] R vs S-PLUS with regard to memory usage

"Anantha Prasad/NE/USDAFS" <aprasad at fs.fed.us> writes:

> I am trying to translate code from S-PLUS to R and R really struggles!
> After starting R with the foll.
> R --vsize 50M --nsize 6M --no-restore
> on a 400 MHz Pentium with 192 MB of memory running Linux (RH 6.2),
> I run a function that essentially picks up an external dataset with 2121
> rows
> and 30 columns and builds a lm() object and also runs step() ... the step
> takes forever to run...(takes very little time in S-PLUS).

Notice that --nsize takes the number of *nodes* as its value. Each
node (cons cell) is 28 bytes on a 32-bit machine, so with --nsize 6M
you're allocating roughly a 170MB chunk there (6,000,000 x 28 bytes =
168MB). With various other memory eaters active, that could easily
push a 192MB machine into thrashing.
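As a back-of-the-envelope check, the node-heap arithmetic can be sketched as follows (assuming the 28-bytes-per-cons-cell figure that R's memory documentation gives for 32-bit builds):

```python
# Rough estimate of the fixed-size node heap reserved by R's --nsize flag.
# Assumes 28 bytes per cons cell, as documented for 32-bit R builds.
NODE_BYTES = 28

def nsize_mb(n_nodes):
    """Approximate size in (decimal) MB of an --nsize allocation."""
    return n_nodes * NODE_BYTES / 1e6

print(nsize_mb(6_000_000))  # --nsize 6M -> 168.0 MB
```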

The upcoming 1.2 version will be much better at handling memory, but
for now maybe reduce the nsize a bit? The vsize looks a bit hefty as
well given that the data should take up on the order of half a MB.
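The half-a-MB figure follows from the dataset's dimensions quoted above: 2121 rows of 30 columns, stored as 8-byte doubles (overhead ignored, as a rough illustration):

```python
# Approximate in-memory size of a purely numeric data frame:
# rows * cols * 8 bytes (one double-precision value per cell).
rows, cols = 2121, 30
data_bytes = rows * cols * 8
print(data_bytes)              # 509040 bytes
print(data_bytes / 1e6, "MB")  # about half a MB
```

On that estimate, a vsize of 50M is roughly a hundred times larger than the data it needs to hold.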

   O__  ---- Peter Dalgaard             Blegdamsvej 3
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N
 (*) \(*) -- University of Copenhagen   Denmark      Ph: (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907

r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
