[R] Another big data size problem

Ernesto Jardim ernesto at ipimar.pt
Wed Jul 28 14:28:20 CEST 2004


On Wed, 2004-07-28 at 12:40, Christian Schulz wrote:
> Hi,
> 
> i'm working with a ~250,000 * 150 data.frame and share your
> problems - last weekend i upgraded my notebook from 512MB to
> 1024MB, and it's really better, especially for load, write.table,
> mysqlReadTable and mysqlWriteTable, because the machine starts
> swapping to disk once RAM is full. One example:
> With 512MB, writing a table to MySQL did not succeed even after
> several hours. With 1024MB it finishes in a few minutes.
> 

Hi,

When you're writing a table to MySQL, you have to be careful if the
table is created by RMySQL. The field definitions may not be the most
appropriate, and the table will have no indexes, which makes queries
_very_ slow.

Regards

EJ
