Thomas W Blackwell
tblackw at umich.edu
Wed Apr 23 20:45:18 CEST 2003
Use your existing machine. Here's a rough calculation:
60,000 rows x 15 columns x 8 bytes = 7.2 MB per copy of the
data set, times 10 - 20 copies of the data set in memory while
you do the calculations = 72 - 144 MB of memory required.
Is it 12 bytes per double instead of 8 in this implementation
of the S language? (I think it is 12 for S-Plus.) Have I
missed a factor of 10 somewhere?
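The back-of-the-envelope arithmetic above can be sketched as follows (the 10 - 20x copy factor is the rule of thumb from the post, not a measured figure):

```python
# Rough memory estimate for a 60,000 x 15 data set of doubles.
rows, cols, bytes_per_double = 60_000, 15, 8

one_copy_mb = rows * cols * bytes_per_double / 1e6  # bytes -> MB

# R/S typically makes several temporary copies of the data during
# model fitting; 10-20 copies is the rule of thumb used above.
low_mb = 10 * one_copy_mb
high_mb = 20 * one_copy_mb

print(f"one copy: {one_copy_mb:.1f} MB")          # 7.2 MB
print(f"working set: {low_mb:.0f}-{high_mb:.0f} MB")  # 72-144 MB
```

With 384 MB of RAM, the estimated 72 - 144 MB working set leaves comfortable headroom, which is the basis for the "use your existing machine" advice.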
I think you should be okay with your existing machine.
Close other processes when you do the analysis.
- tom blackwell - u michigan medical school - ann arbor -
On Wed, 23 Apr 2003, Ruud H. Koning wrote:
> Hello, it is likely that I will have to analyze a rather sizeable dataset:
> 60,000 records, 10 to 15 variables. I will have to produce descriptive
> statistics, and estimate linear models, GLMs and maybe a Cox proportional
> hazards model with time-varying covariates. In theory, this is possible in
> R, but I would like to get some feedback on the equipment I should get for
> this. At this moment, I have a Pentium 3 laptop running windows 2000 with
> 384MB ram. What type of cpu-speed and/or how much memory should I get?
> Thanks for some ideas, Ruud