[R] Can we do GLM on 2GB data set with R?
WILLIE, JILL
JILWIL at SAFECO.com
Sun Jan 21 02:26:50 CET 2007
We want to use R instead of (or in addition to) our existing stats
package because of its huge assortment of statistical functions. However, we
routinely need to fit GLM models to files that are approximately 2-4 GB
(as SQL tables, un-indexed, with tinyint-sized fields except for the
response and weight variables). Does anybody know whether this is feasible
in R, given sufficient hardware? It appears to use a great deal of
memory even on the small files I've tested.
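For concreteness, here is a tiny made-up example of the kind of call we
need to run; the real data would come from the 2-4 GB table, and the column
names here are just placeholders:

## small synthetic stand-in for the real 2-4GB table, only to show
## the shape of the fit we need (column names are made up)
n <- 1000
claims <- data.frame(var1     = sample(0:5, n, replace = TRUE),
                     var2     = sample(0:5, n, replace = TRUE),
                     weight   = runif(n, 0.5, 1.5),
                     response = rpois(n, 2))
fit <- glm(response ~ var1 + var2,
           family = poisson(link = "log"),
           weights = weight,
           data = claims)
summary(fit)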
I've read the data import, memory.limit, memory.size and general
documentation but can't seem to find a way to tell what the boundaries
are or to roughly gauge the needed memory, other than trial and error. I
started by testing with a data.frame and ran out of memory on my PC. I'm new
to R, so please be forgiving if this is a poorly-worded question.
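Is something along these lines a sensible way to gauge the memory ahead of
time? I'm thinking of reading a small sample of rows and scaling
object.size() up (the file name, column types and row count below are just
placeholders for our real data):

## read a small sample of the exported file and extrapolate memory use
sample_rows <- 10000
dat <- read.table("claims.csv", header = TRUE, sep = ",",
                  nrows = sample_rows,
                  colClasses = c(rep("integer", 10), "numeric", "numeric"))
bytes_per_row <- as.numeric(object.size(dat)) / sample_rows
## rough size in GB if the full table had, say, 20 million rows
bytes_per_row * 20e6 / 1024^3
## on Windows, memory.limit() reports (or raises) the cap R will use
memory.limit()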
Jill Willie
Open Seas
Safeco Insurance
jilwil at safeco.com
206-545-5673