[R] Reading in 9.6GB .DAT File - OK with 64-bit R?

Steve Lianoglou mailinglist.honeypot at gmail.com
Fri Mar 9 00:16:52 CET 2012


Hi,

On Thu, Mar 8, 2012 at 1:19 PM, RHelpPlease <rrumple at trghcsolutions.com> wrote:
> Hi there,
> I wish to read a 9.6GB .DAT file into R (64-bit R on a 64-bit Windows
> machine), then delete a substantial number of rows and write the result
> out to a .csv file.  On the first attempt the computer crashed (at some
> point last night).
>
> I'm rerunning this now and am closely monitoring CPU and memory usage.
>
> Setting aside the possibility that the crash was purely a machine issue,
> is R equipped to handle this much data?  I read on the FAQ page that
> 64-bit R can handle larger data sets than 32-bit R.
>
> I'm using the read.fwf function to read in the data.  I don't have access to
> a database program (SQL, for instance).
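
If you stick with read.fwf, reading and writing in chunks (via its `n`
argument) rather than pulling all 9.6GB into one data frame should keep
memory use bounded.  A rough, untested sketch (the widths, column names
and the row filter below are placeholders for whatever your file actually
contains):

widths <- c(10, 2, 8)                    # placeholder field widths
cols   <- c("id", "state", "value")      # placeholder column names
infile <- file("big.dat", open = "rt")   # open once so each read resumes where the last stopped
first  <- TRUE
repeat {
  chunk <- tryCatch(
    read.fwf(infile, widths = widths, n = 100000, col.names = cols),
    error = function(e) NULL)            # read.fwf errors once the input is exhausted
  if (is.null(chunk) || nrow(chunk) == 0) break
  keep <- chunk[chunk$state != "XX", ]   # placeholder "delete rows" rule
  write.table(keep, "big.csv", sep = ",", row.names = FALSE,
              col.names = first, append = !first)
  first <- FALSE
}
close(infile)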

Keep in mind that sqlite3 is just an `install.packages('RSQLite')` away ...
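
For example (again untested, and the database/table/column names below are
made up), you could push the same read.fwf chunks into an on-disk SQLite
table and then do the row deletion and CSV export in SQL:

library(RSQLite)                         # install.packages('RSQLite') first

db     <- dbConnect(SQLite(), dbname = "big.db")
infile <- file("big.dat", open = "rt")
repeat {
  chunk <- tryCatch(
    read.fwf(infile, widths = c(10, 2, 8), n = 100000,   # placeholder widths
             col.names = c("id", "state", "value")),     # placeholder names
    error = function(e) NULL)
  if (is.null(chunk) || nrow(chunk) == 0) break
  if (dbExistsTable(db, "records")) {
    dbWriteTable(db, "records", chunk, append = TRUE)    # append later chunks
  } else {
    dbWriteTable(db, "records", chunk)                   # first chunk creates the table
  }
}
close(infile)

kept <- dbGetQuery(db, "SELECT * FROM records WHERE state <> 'XX'")
write.csv(kept, "big.csv", row.names = FALSE)
dbDisconnect(db)

If the kept subset is itself too large to hold in memory, the final SELECT
can be paged with LIMIT/OFFSET and the pieces appended to the .csv in the
same way as above.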

and this SO thread might be useful w.r.t. sqlite performance and big db files:

http://stackoverflow.com/questions/784173

HTH,
-steve

-- 
Steve Lianoglou
Graduate Student: Computational Systems Biology
 | Memorial Sloan-Kettering Cancer Center
 | Weill Medical College of Cornell University
Contact Info: http://cbio.mskcc.org/~lianos/contact


