[R] Need advice on using R with large datasets
Liaw, Andy
andy_liaw at merck.com
Tue Apr 13 18:51:30 CEST 2004
On a dual Opteron 244 with 16 GB RAM, and
[andy at leo:cb1]% free
             total       used       free     shared    buffers     cached
Mem:      16278648   14552676    1725972          0     229420    3691824
-/+ buffers/cache:   10631432    5647216
Swap:      2096472      13428    2083044
... using freshly compiled R-1.9.0:
> system.time(x <- numeric(1e9))
[1] 3.60 8.09 15.11 0.00 0.00
> object.size(x)/1024^3
[1] 7.45058
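That is just what 1e9 doubles at 8 bytes apiece should come to
(object.size() adds only a small fixed header on top):

1e9 * 8 / 1024^3    # = 7.45058... GiB, matching object.size(x) above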
Andy
> From: Peter Dalgaard
>
> "Roger D. Peng" <rpeng at jhsph.edu> writes:
>
> > I've been running R on 64-bit SuSE Linux on Opterons for a few
> > months now and it certainly runs fine in what I would call standard
> > situations. In particular there seems to be no problem with
> > workspaces > 4GB. But I seldom handle single objects (like
> > matrices, vectors) that are > 4GB. The only exception is lists, but
> > I think those are okay since they are composed of various
> > sub-objects (like Peter mentioned).
>
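Roger's point about lists is easy to check directly: a list's elements
are separate vectors, so the total can pass 4GB while no single
component does. A minimal sketch (element count and length are
arbitrary; it needs a machine with enough free memory):

big <- lapply(1:5, function(i) numeric(2e8))  # five ~1.5 GB double vectors
sum(sapply(big, object.size)) / 1024^3        # total ~7.45 GB in one list,
                                              # yet each component is < 4 GB
rm(big)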
> I just tried, and x <- numeric(1e9) (~8GB) doesn't appear to be a
> problem, except that it takes "forever" since the machine in question
> has only 1GB of memory, and numeric() zero-fills the allocated
> block...
>
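Since every element is written as zero up front, the allocation cost
should scale with the length rather than being a lazy reservation; a
small timing sketch (lengths chosen to stay well inside RAM on most
machines):

for (n in c(1e7, 1e8, 4e8))
    print(system.time(x <- numeric(n)))  # elapsed time grows roughly
                                         # linearly with n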
> --
>    O__  ---- Peter Dalgaard             Blegdamsvej 3
>   c/ /'_ --- Dept. of Biostatistics     2200 Cph. N
>  (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
> ~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907