[R] Another big data size problem

Christian Schulz ozric at web.de
Wed Jul 28 13:40:19 CEST 2004


Hi,

I'm working with a ~250,000 * 150 data.frame and share your problems. Last
weekend I upgraded my notebook from 512MB to 1024MB of RAM, and things are
noticeably better, especially for load, write.table, mysqlReadTable and
mysqlWriteTable, because the machine starts swapping to disk once RAM is full.
One example: with 512MB I could not write a table to MySQL even after several
hours; with 1024MB it finishes in a few minutes.
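
For the MySQL step, writing the data.frame in smaller chunks can also keep the
memory footprint down. A minimal sketch, assuming the DBI/RMySQL interface
(dbConnect, dbWriteTable) and a hypothetical target table "bigtab"; the
connection details and chunk size are placeholders to tune for your setup:

library(RMySQL)

con <- dbConnect(MySQL(), dbname = "mydb", user = "user", password = "pass")  # placeholders

chunk  <- 10000                      # rows per chunk, tune to available RAM
n      <- nrow(df)                   # df is the ~250,000 * 150 data.frame
starts <- seq(1, n, by = chunk)

for (s in starts) {
    e <- min(s + chunk - 1, n)
    ## first slice creates (or overwrites) the table, later slices append to it
    dbWriteTable(con, "bigtab", df[s:e, ],
                 overwrite = (s == 1), append = (s > 1), row.names = FALSE)
}

dbDisconnect(con)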

regards, christian


On Wednesday, 28 July 2004 at 04:10, Federico Gherardini wrote:
> Hi all,
>
> I'm trying to read a 1220 * 20000 table into R but I'm having a lot of
> problems. Basically what happens is that R.bin starts eating all my
> memory until it reaches about 90%. At that point it locks itself in an
> uninterruptible sleep state (at least that's what top says), where it just
> sits there barely using the CPU at all but keeping its tons of memory. I've
> tried read.table and scan but neither of them did the trick. I've also
> tried some horrible hacks like reading one line at a time and gradually
> combining everything into a matrix using rbind... nope! It seems I can read
> up to 500 lines in a *decent* time but nothing more. The machine is a 3 GHz
> P4 with HT and 512 MB RAM running R-1.8.1. Will I have to write a little
> C program myself to handle this, or am I missing something?
>
> Thanks in advance for your help,
>
> fede
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://www.stat.math.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide!
> http://www.R-project.org/posting-guide.html
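
For the quoted read.table problem, the usual first steps are to tell
read.table the column types and row count up front, or to read straight into
a matrix with scan(). A minimal sketch, assuming a whitespace-separated file
named "big.txt" (the file name is hypothetical; the 1220 * 20000 dimensions
come from the post above):

## Telling read.table the column types and row count up front avoids the
## type guessing and repeated re-allocation that eat memory on wide files.
dat <- read.table("big.txt",
                  header       = FALSE,
                  colClasses   = rep("numeric", 20000),  # skip per-column type guessing
                  nrows        = 1220,                   # allocate the right length once
                  comment.char = "")                     # no comment scanning

## Alternatively, read straight into a matrix via scan(), which is lighter
## than building a data.frame with 20000 columns:
m <- matrix(scan("big.txt", what = double()),
            nrow = 1220, ncol = 20000, byrow = TRUE)

## If a line-by-line loop is really unavoidable, preallocate the full matrix
## and fill it by row index instead of growing it with rbind(), which copies
## the whole object on every iteration.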



