[R] vsize and nsize

Thomas Lumley thomas at biostat.washington.edu
Tue May 18 17:47:59 CEST 1999

On Tue, 18 May 1999, Jim Lindsey wrote:

> I am wondering what you mean by "R's poor handling of large datasets".
> How large is large? I have often been working simultaneously with a
> fair number of vectors of say 40,000 using my libraries (data objects
> and functions) with no problems. They use the R scoping rules. On the
> other hand, if you use dataframes and/or standard functions like glm,
> then you are restricted to extremely small (toy) data sets. But then
> maybe you are thinking of gigabytes of data.
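The vsize and nsize of the subject line are the two memory limits at issue here: in R of this era they fix the size of the vector heap (vsize) and the number of cons cells (nsize) when R starts, so a data set that outgrows them cannot be loaded mid-session. A hedged sketch of raising them at startup, assuming the command-line flag syntax of circa-0.64 R (the exact spelling may differ by version; check R --help or the R-FAQ):

```shell
# Start R with a larger vector heap (vsize) and more cons cells (nsize)
# before reading a large data set. The sizes below are illustrative
# values, not recommendations.
R --vsize=20M --nsize=1000k
```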

While I agree that R isn't much use for really big datasets, I am a bit
surprised by Jim's comments about glm(). I have used R (including glm)
perfectly satisfactorily on real data sets of ten thousand or so
observations. This isn't large by today's standards but it isn't in
any sense toy data.
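As a rough illustration of the scale being discussed, here is a hedged sketch of fitting glm() to a simulated data set of ten thousand observations; the model and coefficients are invented for the example, not taken from the thread:

```r
## Simulate ~10,000 observations and fit a logistic regression with
## glm() -- the scale of "real data set" discussed above.
set.seed(1)
n <- 10000
x <- rnorm(n)
p <- 1 / (1 + exp(-(0.5 + 1.2 * x)))   # true intercept 0.5, slope 1.2
y <- rbinom(n, 1, p)
fit <- glm(y ~ x, family = binomial)
coef(fit)   # estimates should land near the true values
```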

Thomas Lumley
Assistant Professor, Biostatistics
University of Washington, Seattle

r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject!) To: r-help-request at stat.math.ethz.ch
