[R] Solutions for memory problems (packages ff, bigmemory)?
hhafner at statistik-hessen.de
Tue Mar 23 15:38:23 CET 2010
Hello,
I want to impute missing values using the package mi. My file is around
28 MB (50,000 observations, 120 variables). My computer is a 32-bit
Windows XP machine with a Core Duo processor and 4 GB of RAM.
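For reference, this is roughly how I load the data and check its size in
memory (the file name is just a placeholder):

# load the data and check how much RAM the object itself takes
dat <- read.csv("mydata.csv")          # 50,000 rows, 120 columns
print(object.size(dat), units = "Mb")  # about 28 Mb in my case
dim(dat)

So the raw dataset itself is small; the problem seems to arise from what
mi allocates during the iterations.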
When I run mi, after only a few minutes, still in iteration 1, I get a
message that a vector allocation has failed. Even with a 10% sample of
the dataset it doesn't work.
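In case it helps, these are the commands I use to check R's memory
situation before and after the failure (Windows-only functions, as I
understand it):

memory.limit()           # maximum MB that R may use on this machine
memory.size()            # MB currently used by R
memory.size(max = TRUE)  # maximum MB used so far in this session
gc()                     # force garbage collection and report usage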
I've read about the packages ff and bigmemory. Can they make mi keep
its temporary data on the hard drive instead of in RAM? Since I'm not
an R expert, I don't completely understand how these packages work. Are
there other packages that offer better memory management? Or is a
64-bit OS really the only solution? I hope someone has suggestions,
since many R procedures seem very promising but fail on my machine
because of the memory limitation.
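From the package documentation, I understand that file-backed objects
would be created roughly like this (a minimal sketch with made-up file
names; whether mi can actually work with such objects is exactly what I
don't know):

library(ff)
# read the file into a file-backed data frame; the data stay on disk
ffdat <- read.csv.ffdf(file = "mydata.csv")

library(bigmemory)
# file-backed numeric matrix; note that a big.matrix holds a single
# numeric type, so factors/characters would need recoding first
bm <- filebacked.big.matrix(nrow = 50000, ncol = 120, type = "double",
                            backingfile = "mydata.bin",
                            descriptorfile = "mydata.desc")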
Thanks,
Hans-Peter