[R] Need advice on using R with large datasets
Peter Dalgaard
p.dalgaard at biostat.ku.dk
Tue Apr 13 21:12:04 CEST 2004
"Liaw, Andy" <andy_liaw at merck.com> writes:
> On a dual Opteron 244 with 16GB ram, and
>
> [andy at leo:cb1]% free
>              total       used       free     shared    buffers     cached
> Mem:      16278648   14552676    1725972          0     229420    3691824
> -/+ buffers/cache:   10631432    5647216
> Swap:      2096472      13428    2083044
>
> ... using freshly compiled R-1.9.0:
>
> > system.time(x <- numeric(1e9))
> [1] 3.60 8.09 15.11 0.00 0.00
> > object.size(x)/1024^3
> [1] 7.45058
Well,
> system.time(mean(x))
[1] 15.80 20.94 1323.01 0.00 0.00
> object.size(x)/1024^3
[1] 7.45058
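
(For reference, that 7.45 is just the arithmetic of a billion doubles at
8 bytes each; a quick check in R confirms it:

1e9 * 8 / 1024^3   # size of numeric(1e9) in GiB
## [1] 7.450581
)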
I suppose I just have to look forward to RAM prices dropping...
(Actually, the OS should be able to do better: it ought to read the
data from disk at about 20 s/GB.)
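
One way to get closer to that disk rate, sketched here under the
assumption that the data sit in a flat binary file of 8-byte doubles
(the file name and chunk size below are made up for illustration), is
to stream the vector through readBin() in blocks instead of paging a
full in-memory copy through swap:

chunked_mean <- function(path, chunk = 1e6) {
    con <- file(path, open = "rb")
    on.exit(close(con))
    total <- 0
    n <- 0
    repeat {
        ## read the next block of doubles; length 0 means end of file
        xs <- readBin(con, what = "double", n = chunk)
        if (length(xs) == 0) break
        total <- total + sum(xs)   # running sum; fine for a sketch
        n <- n + length(xs)
    }
    total / n
}

## chunked_mean("x.bin")   # hypothetical file holding the 1e9 doubles

At roughly 20 s/GB, streaming the 7.45 GB vector that way would take on
the order of two and a half minutes rather than the 22 minutes of
elapsed time above.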
--
   O__  ---- Peter Dalgaard             Blegdamsvej 3
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N
 (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907