[R] (no subject)

jim holtman jholtman at gmail.com
Sun Sep 16 05:41:46 CEST 2007


When you say you cannot import 4.8GB, is that the size of the text
file you are reading in?  If so, what is the structure of the
file?  How are you reading it in ('read.table', 'scan', etc.)?
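For a multi-gigabyte file, how you call 'read.table' matters a lot. A minimal sketch (the file name, column names, and column types below are assumptions for illustration): specifying 'colClasses' skips R's type-guessing pass, 'nrows' lets R allocate the result once, and 'comment.char = ""' disables comment scanning.

```r
## Stand-in for the real multi-GB file -- a tiny sample written to a
## temporary path so the example is self-contained:
tf <- tempfile(fileext = ".txt")
writeLines(c("id value label",
             "1 3.14 a",
             "2 2.72 b"), tf)

## Telling read.table the structure up front instead of letting it
## guess; on a 4.8GB file this saves substantial time and memory:
dat <- read.table(tf,
                  header       = TRUE,
                  colClasses   = c("integer", "numeric", "character"),
                  nrows        = 2,      # a slight overestimate is fine
                  comment.char = "")
str(dat)
```

On the real file you would set 'nrows' to (roughly) the line count, e.g. from 'wc -l' at the shell.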

Do you really need all the data, or can you work with a portion at a
time?  If so, consider putting the data in a database and
retrieving it as needed.  If all the data is in one object, how
big do you think that object will be (# rows, # columns, mode of the
data)?
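A back-of-the-envelope size estimate helps here: each numeric cell in R costs 8 bytes, so an all-numeric table needs at least rows * cols * 8 bytes in memory, before any copies made during import. The row and column counts below are hypothetical, just to show the arithmetic; the database sketch in the comments assumes the RSQLite package.

```r
## Hypothetical dimensions -- a 4.8GB text file of numerics might
## yield something on this order:
rows  <- 10e6
cols  <- 60
bytes <- rows * cols * 8   # 8 bytes per numeric cell
bytes / 2^30               # object size in GB, copies not included

## If only a portion is needed at a time, a database avoids holding
## it all in RAM (sketch, assuming the RSQLite package is installed):
## library(RSQLite)
## con   <- dbConnect(SQLite(), "big.sqlite")
## dbWriteTable(con, "big", dat)
## chunk <- dbGetQuery(con, "SELECT * FROM big WHERE id < 100")
## dbDisconnect(con)
```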

So you need to provide some more information about the problem
you are trying to solve.

On 9/15/07, tkobayas at indiana.edu <tkobayas at indiana.edu> wrote:
> Hi,
>
> Let me apologize for this simple question.
>
> I use 64 bit R on my Fedora Core 6 Linux workstation. A 64 bit R has
> saved a lot of time. I am sure this has a lot to do with my memory
> limit, but I cannot import a 4.8GB file. My workstation has 8GB of RAM,
> an Athlon X2 5600, and a 1200W PSU. This PC configuration is the best I
> could get.
>
> I know a bit of C and Perl. Should I use C or Perl to manage this large
> dataset? Or should I even go to 16GB of RAM?
>
> Sorry for this silly question, but I would appreciate it if anyone
> could give me advice.
>
> Thank you very much.
>
> TK
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>


-- 
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem you are trying to solve?
