[R] Large datasets in R

Gabor Grothendieck ggrothendieck at gmail.com
Mon Jul 17 21:10:07 CEST 2006

You may or may not have problems. R keeps all of its data in memory,
so you will need enough RAM to hold the data plus any derived data
and code. Since R is free, you can simply try it out. If your
problems turn out to be too large, you can always get more memory, or
use S-Plus, which can handle larger datasets; its language is similar
to R's, so you could largely reuse your code.
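As a rough sanity check before loading a large file, you can estimate the
in-memory footprint (each numeric value takes 8 bytes) and compare it with
what object.size() reports for a real object. The row and column counts
below are hypothetical, just to illustrate the arithmetic:

```r
# Hypothetical dataset dimensions -- substitute your own.
n <- 1e6   # rows
k <- 10    # numeric columns

# Each double is 8 bytes, so the raw values alone need about:
approx_bytes <- 8 * n * k   # 80,000,000 bytes, i.e. ~80 MB
print(approx_bytes)

# object.size() measures what an object actually occupies in memory
# (slightly more than the raw values, due to object overhead).
x <- matrix(0, nrow = 1000, ncol = 10)
print(object.size(x))
```

Keep in mind that many R operations make temporary copies, so in practice
you want comfortably more memory than the raw size of the data.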

On 7/17/06, Deepankar Basu <basu.15 at osu.edu> wrote:
> Hi!
> I am a student of economics and currently do most of my statistical work
> using STATA. For various reasons (not least of which is an aversion to
> proprietary software), I am thinking of shifting to R. At the current
> juncture my concern is the following: would I be able to work on
> relatively large data-sets using R? For instance, I am currently working
> on a data-set which is about 350MB in size. Would it be possible to work
> with data-sets of such sizes using R?
> I have been trying to read up on the postings in the R-help archives on
> this topic, but I could not really understand all the discussion, nor
> could I reach the "end". So, I am not aware of the current state of
> consensus on the issue.
> It would help a lot if some current user could throw some light on this
> issue of large data-sets in R.
> Thanks in advance.
> Deepankar Basu
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
