[R] Large Dataset

Edwin Sendjaja edwin7 at web.de
Tue Jan 6 13:24:44 CET 2009


Hi Simon,

Thanks for your reply.
I have read ?memory, but I don't understand how to use it, and I am not sure
it can solve my problem. Can you explain in more detail?

Thanks,

Edwin

> type
>
> ?memory
>
> into R and that will explain what to do...
>
> S
> ----- Original Message -----
> From: "Edwin Sendjaja" <edwin7 at web.de>
> To: <r-help at r-project.org>
> Sent: Tuesday, January 06, 2009 11:41 AM
> Subject: [R] Large Dataset
>
> > Hi all,
> >
> > I have a 3.1 GB dataset (11 columns, with lots of integer and string
> > data). Reading it with read.table takes very long; it seems my RAM is
> > not big enough (it gets exhausted). I have 3.2 GB RAM and 7 GB swap on
> > 64-bit Ubuntu.
> >
> > Is there a good solution for reading large data into R? I have seen
> > people suggest the bigmemory and ff packages, but they seem very
> > complicated, and I don't know how to get started with them.
> >
> > I have tried bigmemory, but I got some errors and gave up.
> >
> >
> > Can someone give me a simple example of how to use ff or bigmemory, or
> > suggest a better solution?
> >
> >
> >
> > Thank you in advance,
> >
> >
> > Edwin
> >
> > ______________________________________________
> > R-help at r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-help
> > PLEASE do read the posting guide
> > http://www.R-project.org/posting-guide.html
> > and provide commented, minimal, self-contained, reproducible code.
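A minimal sketch of the usual first remedy, assuming a comma-separated file: declaring colClasses lets read.table skip its type-guessing pass, which cuts both time and peak memory on large inputs. The file name "big.csv" and the two-column layout below are placeholders (a tiny stand-in file is written so the snippet runs as-is); the commented ff variant at the end is the on-disk alternative.

```r
## Placeholder input: write a tiny stand-in for the real 3.1 GB file
## so this example is self-contained.
writeLines(c("id,name", "1,foo", "2,bar"), "big.csv")

## Declaring the column classes up front lets read.table skip its
## type-guessing pass, saving time and peak memory on large files.
colTypes <- c("integer", "character")
dat <- read.table("big.csv", header = TRUE, sep = ",",
                  colClasses = colTypes, comment.char = "")

## With the ff package (install.packages("ff")) the data stay on disk
## and only chunks are pulled into RAM -- same arguments, different reader:
## library(ff)
## datff <- read.table.ffdf(file = "big.csv", header = TRUE, sep = ",",
##                          colClasses = colTypes)
```

For truly out-of-memory work with mixed integer/string columns, ff tends to be a better fit than bigmemory, whose big.matrix holds a single numeric type.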



