[R] Large data files
cstrato at EUnet.at
Wed Dec 29 21:00:50 CET 1999
Dear R and S-Plus users:
Currently I am using:
at work: "S-Plus 2000 Pro" on a PC: Pentium II/350MHz, 256 MB RAM,
running Win NT
at home: "R" on my Mac PowerBook G3/292MHz, 128 MB RAM, running LinuxPPC
Currently, at home, I am trying to import a table (nrow=302500, ncol=6),
which I have to do one column at a time because of memory problems. Some of
the columns I use directly; others I have to convert into matrices (550 x 550).
Ultimately, I will have to import many (ca. 20-100) of these tables, which
will be impossible on my current machines due to memory limitations.
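To illustrate the column-at-a-time approach, here is a minimal R sketch. It assumes a whitespace-delimited text file (the name "data.txt" and the helper `read.column` are my own, hypothetical). Passing a list of NULLs to scan()'s 'what' argument skips the unwanted fields, so only one column is ever held in memory; since 302500 = 550 * 550, the column reshapes exactly into a 550 x 550 matrix.

```r
## Read only column j of a whitespace-delimited table, skipping the
## other columns to keep memory use low.
read.column <- function(file, j, ncol = 6) {
  what <- rep(list(NULL), ncol)  # NULL components tell scan() to skip a field
  what[[j]] <- double(0)         # read column j as numeric
  scan(file, what = what, quiet = TRUE)[[j]]
}

## Hypothetical usage: import one column, then reshape it.
x <- read.column("data.txt", j = 1)      # numeric vector, length 302500
m <- matrix(x, nrow = 550, ncol = 550)   # 550 x 550 matrix
```

Reading column by column trades I/O time (the file is rescanned for each column) for a much smaller memory footprint than read.table() on the full table.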
My question now is the following:
At work I have access to the following multiprocessor machines:
a, Compaq Proliant Server: 4 x Pentium II/450MHz, 2 GB RAM, Win NT
b, Sun Enterprise 450 Server: 4 x SPARC/??MHz, 2 GB RAM, Solaris 2.6
For testing purposes I would like to install "R":
1, Can R take advantage of multiprocessor machines?
2, Which machine would be better suited to run R on?
Finally, a few more questions:
Is R or S-Plus better suited for handling such large data?
Would "S-Plus 2000" for Win NT or "S-Plus 5" for Unix be better suited?
Can S-Plus take advantage of multiprocessor machines?
Thank you in advance for your help
and Happy New Year 2000 (hopefully not 1900)
Christian Stratowa, Ph.D., Vienna
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !) To: r-help-request at stat.math.ethz.ch