[R] naive question
Peter Dalgaard
p.dalgaard at biostat.ku.dk
Wed Jun 30 20:31:20 CEST 2004
<rivin at euclid.math.temple.edu> writes:
> I did not use R ten years ago, but "reasonable" RAM amounts have
> multiplied by roughly a factor of 10 (from 128 MB to 1 GB), CPU speeds
> have gone up by a factor of 30 (from 90 MHz to 3 GHz), and disk space
> availability has gone up probably by a factor of 10. So, unless I/O
> performance scales nonlinearly with size (a bit strange but not
> inconsistent with my R experiments), I would think that things should
> have gotten faster (by the wall clock), not slower. Of course, it is
> possible that the other components of the R system have been worked on
> more -- I am not equipped to comment...
I think your RAM calculation is a bit off. In late 1993, 4 MB systems
were the standard PC, with 16 or 32 MB on high-end workstations.
Comparable figures today are probably 256 MB for an entry-level PC
and a couple of GB at the high end. So that's more like a factor of 64.
On the other hand, CPUs have changed by more than the clock speed; in
particular, the number of clock cycles per floating-point operation has
decreased considerably and is currently less than one in some
circumstances (pipelined, superscalar FP units can retire more than one
operation per cycle).
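For a rough back-of-the-envelope check (my own sketch, not anything
measured in this thread), you can time a large vectorized operation in R
and relate it to an assumed clock speed. The 3 GHz figure below is an
assumption to be replaced with your own CPU's frequency, and R's
per-vector overhead (allocation, interpreter dispatch) means the result
overstates the true cycles per FP operation:

  n <- 1e7
  x <- runif(n)
  y <- runif(n)
  ## time n floating-point multiplications (plus R's vector overhead)
  elapsed <- system.time(z <- x * y)["elapsed"]
  clock_hz <- 3e9   # assumed 3 GHz CPU; substitute your own
  ## cycles spent per multiplication, as an upper bound
  cycles_per_flop <- clock_hz * as.numeric(elapsed) / n
  cycles_per_flop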
--
   O__  ---- Peter Dalgaard             Blegdamsvej 3
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N
 (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907