[R] error loading huge .RData
Peter Dalgaard BSA
p.dalgaard at biostat.ku.dk
Wed Apr 24 14:39:15 CEST 2002
"Liaw, Andy" <andy_liaw at merck.com> writes:
> Patrick,
>
> I appreciate your comments, and practice everything that you preach.
> However, that workspace image contains only 2-3 R objects: the input and
> output of a single R command. I knew there could be problems, so I've
> stripped it down to the bare minimum. Yes, I also kept the commands in a
> script. That single command (in case you want to know: a random forest run
> with 4000 rows and nearly 7000 variables) took over 3 days to run. There's
> not a whole lot I can do here when the data is so large.
Hmm. You could be running into a situation where the data temporarily
take up more memory while being loaded than they do once in the
workspace. It does sound like a bug if R can write images that are
bigger than it can read back. Not sure how to proceed, though. Does
anyone on R-core have a similarly big system and a spare gigabyte of
disk? Is it possible to create a mock-up of similarly organized data
that displays the same effect but takes less than three days to
produce?
-p
BTW: Did we ever hear what system this is happening on?
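PS: In the meantime, saving the objects one per file might be worth a
try: it lets you load them one at a time, which should both keep the
peak memory needed at load time down and show which object (if any) is
the one that fails. Another untested sketch, with made-up object and
file names:

    ## Save each object to its own file instead of one big image
    ## ("input.data" and "rf.out" stand in for the real names).
    save(input.data, file = "input.RData")
    save(rf.out, file = "rf-out.RData")
    ## Then, in a fresh R session, load them one at a time:
    load("input.RData")
    load("rf-out.RData")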
--
   O__  ---- Peter Dalgaard             Blegdamsvej 3
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N
 (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907