[R] naive question
rivin at euclid.math.temple.edu
Wed Jun 30 22:25:53 CEST 2004
> <rivin at euclid.math.temple.edu> writes:
>
>> I did not use R ten years ago, but "reasonable" RAM amounts have
>> multiplied by roughly a factor of 10 (from 128 MB to 1 GB), CPU speeds
>> have gone up by a factor of 30 (from 90 MHz to 3 GHz), and disk space
>> availability has gone up probably by a factor of 10. So, unless the I/O
>> performance scales nonlinearly with size (a bit strange, but not
>> inconsistent with my R experiments), I would think that things should
>> have gotten faster (by the wall clock), not slower. Of course, it is
>> possible that the other components of the R system have been worked on
>> more -- I am not equipped to comment...
>
> I think your RAM calculation is a bit off. In late 1993, 4 MB was
> standard on PCs, with 16 or 32 MB on high-end workstations.
I beg to differ. In 1989, the Mac II came standard with 8 MB, and the NeXT
came standard with 16 MB. By 1994, 16 MB was pretty much standard on
good-quality PCs (= Pentiums, of which the 90 MHz part was an early
example), with 32 MB pretty common (though I suspect that most R/S-Plus
users were on Suns, which were somewhat more plushly equipped).
> Comparable figures today are probably 256 MB for an entry-level PC and
> a couple of GB at the high end. So that's more like a factor of 64. On
> the other hand, CPUs have changed by more than the clock speed; in
> particular, the number of clock cycles per FP calculation has decreased
> considerably and is currently less than one in some circumstances.
>
I think that FP performance has increased more than integer performance,
which has pretty much kept pace with the clock speed. The compilers have
also improved a bit...
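For what it's worth, here is a quick back-of-the-envelope tally in R of
the growth factors being debated, using the round numbers quoted above
(the exact baselines are, of course, arguable):

  ## RAM: 1993/94 baselines vs. rough 2004 equivalents (all in MB)
  ram.then <- c(pc = 4, workstation = 16)
  ram.now  <- c(pc = 256, workstation = 2048)
  ram.now / ram.then          # factors of 64 and 128

  ## CPU clock: 90 MHz Pentium vs. 3 GHz
  3000 / 90                   # roughly 33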
Igor