[R] Reasons to Use R

Bi-Info (http://members.home.nl/bi-info) bi-info at home.nl
Wed Apr 11 17:56:46 CEST 2007

I certainly have that idea too. SPSS works in much the same way, 
although it specialises in PC applications. Adding memory to a PC is 
not very expensive these days. On my first AT some extra memory 
cost 300 dollars or more; these days you practically get extra memory 
with a package of marshmallows or chocolate bars if you need it.
All computations on a computer are discrete steps in a way, but I've 
heard that SAS computations are split into strictly divided steps. That 
also makes procedures "attachable", I've been told, and interchangeable: 
different procedures can use the same code, which is 
cheaper in memory usage or disk usage (the old days...). That also makes 
SAS a complicated machine to build, because procedures that are 
split into numerous fragments require complicated bookkeeping. If 
you do it that way, I've been told, you can do a lot of computations 
with very little memory. One guy actually computed quite complicated 
models with "only 32MB or less", which wasn't very much for "his type of 
calculations". Which suggests that SAS is efficient in memory handling, I 
think. It's not very efficient in dollar handling... I estimate.



Certainly true.  In particular, SAS was designed from the start to
store data items on disk, and to read into core memory only the
minimum needed for a particular calculation.

The kind of data SAS handles is (for the most part) limited to
rectangular arrays, similar to R data frames. In many procedures
the data can be read from disk sequentially (row by row), which
undoubtedly simplifies memory handling.  It seems logical to
suppose that in developing SAS, algorithms were chosen to
support that style of memory management. Finally, a SAS program
consists of discrete steps of computation, between which nothing
but the program itself need be held in core memory.
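As a rough illustration of that row-by-row style (a sketch in Python rather than SAS or R, with a small in-memory CSV standing in for a data set on disk -- the function name and data are made up for this example), a column mean can be computed while holding only one row and two accumulators in memory, instead of loading the whole table:

```python
import csv
import io

def streaming_mean(rows, column):
    """Mean of one column, reading rows one at a time.

    Only the current row and two accumulators are ever held in
    memory -- the sequential, disk-oriented style described above.
    """
    total = 0.0
    count = 0
    for row in rows:  # rows arrive one at a time, e.g. from disk
        total += float(row[column])
        count += 1
    return total / count if count else float("nan")

# A small CSV standing in for a file on disk.
data = io.StringIO("x,y\n1,10\n2,20\n3,30\n")
print(streaming_mean(csv.DictReader(data), "x"))  # 2.0
```

Whole-object systems like S+/R instead read the full data frame into memory first, which is more flexible but costs memory proportional to the data size rather than to a single row.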

"Gabor Grothendieck" <ggrothendieck op gmail.com> wrote:

> I think SAS was developed at a time when computer memory was
> much smaller than it is now, and its better usage of computer
> resources is a legacy of that.
> On 4/10/07, Wensui Liu <liuwensui op gmail.com> wrote:
> > Greg,
> > As far as I understand, SAS is probably more efficient at handling
> > large data than S+/R. Do you have any idea why?

Mike Prager, NOAA, Beaufort, NC
* Opinions expressed are personal and not represented otherwise.
* Any use of tradenames does not constitute a NOAA endorsement.

R-help op stat.math.ethz.ch mailing list


