[R] How to deal with more than 6GB dataset using R?
Greg Snow
Greg.Snow at imail.org
Sat Jul 24 21:55:59 CEST 2010
You may want to look at the biglm package as another way to fit regression models on very large data sets; it updates the fit incrementally from chunks of the data, so the whole file never needs to be in memory at once.
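For example, here is a minimal sketch of the chunked-update approach (assuming a comma-separated file named "bigdata.csv" with a response column y and predictors x1 and x2; adjust the file name, formula, and chunk size to your data):

library(biglm)

chunk.size <- 100000                    # rows per pass; tune to your RAM
con <- file("bigdata.csv", open = "r")

## First chunk: read the header row and fit the initial model.
first <- read.csv(con, nrows = chunk.size)
fit <- biglm(y ~ x1 + x2, data = first)

## Remaining chunks: the connection stays open, so each read.csv call
## picks up where the last one stopped; update() folds each chunk into
## the existing fit, so the full 6+ GB file is never loaded at once.
repeat {
  chunk <- try(read.csv(con, header = FALSE, nrows = chunk.size,
                        col.names = names(first)), silent = TRUE)
  if (inherits(chunk, "try-error") || nrow(chunk) == 0) break
  fit <- update(fit, chunk)
}
close(con)

summary(fit)

coef() and summary() then work much as they do for an lm() fit, and the same package provides bigglm() if you need generalized linear models.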
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
greg.snow at imail.org
801.408.8111
> -----Original Message-----
> From: r-help-bounces at r-project.org [mailto:r-help-bounces at r-
> project.org] On Behalf Of babyfoxlove1 at sina.com
> Sent: Friday, July 23, 2010 10:10 AM
> To: r-help at r-project.org
> Subject: [R] How to deal with more than 6GB dataset using R?
>
> Hi there,
>
> Sorry to bother those who are not interested in this problem.
>
> I'm dealing with a large data set (a file of more than 6 GB) and
> fitting regression models to those data. I was wondering whether
> there are any more efficient ways to read the data than read.table().
> BTW, I'm using a 64-bit desktop with a 64-bit version of R, and the
> machine has enough memory for my purposes.
> Thanks.
>
>
> --Gin