[R] practical memory limits
ivo welch
ivowel at gmail.com
Sat Feb 10 19:02:46 CET 2007
Dear R experts: I want to learn what the practical memory
limits are for working comfortably with R.
(My specific problem is that I want to work with daily stock returns.
In ASCII, the data set contains about 72 million returns, which would
have to go into a sparse matrix (not all stocks exist for the whole series).
As a guess, this will consume about 700MB. My main use will be linear
operations: regressions, means, etc.)
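To make the 700MB guess concrete, here is the back-of-envelope
arithmetic I have in mind (a rough sketch only; the 12-bytes-per-entry
figure is my assumption about the Matrix package's column-compressed
dgCMatrix layout):

n.returns <- 72e6                     # observed daily returns
bytes.per.double <- 8
n.returns * bytes.per.double / 2^20   # ~549 MB for the return values alone

## a column-compressed sparse matrix (dgCMatrix) also keeps an integer
## row index per stored entry, so very roughly 12 bytes per return:
n.returns * 12 / 2^20                 # ~824 MB

## once an object exists, its size can be checked directly:
x <- rnorm(1e6)
object.size(x)                        # about 8 MB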
I am on linux, so I can create swap space, but I am concerned that the
thrashing will be so bad that the machine becomes unusable. In
fact, the last time I used swap was over 3 years ago; since then, I
have just kept it turned off.
I have 2GB of RAM right now, and could upgrade this to 4GB.
Are there some general guidelines on what the relationship between
data set size and memory should be under R? I know this will vary with
the task involved, but some guidance would be better than none.
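For concreteness, here is the kind of experiment I imagine running to
gauge the working-memory overhead of a regression (the sizes below are
made-up illustrations, not my actual data):

gc(reset = TRUE)                 # reset the "max used" counters
n <- 1e6; k <- 10
X <- matrix(rnorm(n * k), n, k)  # ~80 MB of doubles
y <- rnorm(n)
fit <- lm(y ~ X)                 # lm() builds and copies a model matrix
gc()                             # "max used" shows the peak, well above 80 MB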
regards,
/iaw