[R] Need advice on using R with large datasets
Peter Dalgaard
p.dalgaard at biostat.ku.dk
Tue Apr 13 18:33:07 CEST 2004
"Roger D. Peng" <rpeng at jhsph.edu> writes:
> I've been running R on 64-bit SuSE Linux on Opterons for a few months
> now and it certainly runs fine in what I would call standard
> situations. In particular there seems to be no problem with
> workspaces > 4GB. But I seldom handle single objects (like matrices,
> vectors) that are > 4GB. The only exception is lists, but I think
> those are okay since they are composed of various sub-objects (like
> Peter mentioned).
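(A hedged illustration of the list point above; the element counts are
made up, and printing object.size() with a units argument assumes a
reasonably recent R:)

    ## Each component of a list is a separate R object, so the list as
    ## a whole can exceed 4GB even though no single component does.
    big <- list(a = numeric(3e8),  # ~2.4 GB of doubles
                b = numeric(3e8),  # ~2.4 GB
                c = numeric(3e8))  # ~2.4 GB; ~7.2 GB in total
    print(object.size(big), units = "Gb")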
I just tried, and x <- numeric(1e9) (~8GB) doesn't appear to be a
problem, except that it takes "forever" since the machine in question
has only 1GB of memory, and numeric() zero-fills the allocated
block...
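Something like the following reproduces the experiment (a sketch only;
it assumes a 64-bit build of R and enough swap to back the allocation):

    x <- numeric(1e9)                    # 1e9 doubles * 8 bytes ~ 8 GB
    print(object.size(x), units = "Gb")  # about 7.45 Gb
    rm(x); gc()                          # give the memory back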
--
   O__  ---- Peter Dalgaard             Blegdamsvej 3
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N
 (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907