[R] Help with large datasets

Steven Boker sboker at mpipks-dresden.mpg.de
Tue Mar 20 20:17:28 CET 2001


I am a new user of R (1.2.2 on Alpha), but have been using S and then
Splus heavily for about 10 years now.  My problem is this: the data
I analyze come in large sets.  Typically I am analyzing 5000 observations
on 90 variables over several hundred subjects; sometimes 500,000
subjects with 200 variables.  Unfortunately, although my Alpha has
1.5 GB of RAM, R as it is configured seems to be limited to a maximum
of about 100 MB of workspace (as best I can tell).  The published
command-line switches seem able to restrict the memory parameters, but
not to enlarge the main workspace.

What I'd like to do is make the workspace much larger (10x), either
on the fly, if possible, or by changing the appropriate #defines
and recompiling, so as to be able to analyze my admittedly excessive data.
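For concreteness, here is a sketch of the kind of invocation I have in mind, assuming the memory switches documented for R 1.2 (`--min-vsize`/`--max-vsize` for the vector heap and `--min-nsize`/`--max-nsize` for the number of cons cells); the flag names and values below are illustrative, not something I have confirmed works at this scale:

```shell
# Hypothetical invocation -- flag names assume the memory options
# documented in R 1.2's ?Memory help page; the values are illustrative.
# Request up to a 1 GB vector heap and up to 20 million cons cells.
R --max-vsize=1000M --max-nsize=20M
```

If these switches really do only cap (rather than raise) the workspace on this build, the recompilation route would presumably mean changing the corresponding defaults in the R source and rebuilding.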

Is there a short happy answer to my plea?

Thanks in advance,
Steve


-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
