[R] RMySQL query: why result takes so much memory in R ?

christian schulz ozric at web.de
Mon May 2 18:19:39 CEST 2005


Hi,

IMHO, if all your columns were numeric (8 bytes per double), you would
need roughly rows * columns * 8 bytes:

 > (12000000*3*8)/1024^2
[1] 274.6582

i.e. about 275 MB for the raw numeric data alone.

But one of your columns, group, is char(15), and character data costs
considerably more in R. I have struggled a lot with massive data in R
in the past, and my lesson was: do as much of the work as possible
inside the database, or else upgrade your computer to 2-4 GB of RAM,
like a dedicated database machine.
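One way to follow that advice while still using RMySQL is to fetch the
result in chunks with dbSendQuery()/fetch() instead of pulling all 12
million rows at once with dbGetQuery(), processing each chunk before
fetching the next. This is only a sketch, reusing the table and column
names from Christoph's post; the chunk size and the aggregation step
are placeholders you would adapt:

```r
library(RMySQL)

drv <- dbDriver("MySQL")
ch  <- dbConnect(drv, dbname = "testdb",
                 user = "root", password = "mysql")

## Send the query without fetching anything yet; `group` is a MySQL
## reserved word, so it has to be backtick-quoted
res <- dbSendQuery(ch, "select id, `group`, measurement from mydata")

## Pull 500000 rows at a time; each chunk is a small data frame that
## can be reduced (summed, filtered, written out) and then discarded
while (!dbHasCompleted(res)) {
    chunk <- fetch(res, n = 500000)
    ## ... aggregate or filter `chunk` here, keep only the summary ...
}

dbClearResult(res)
dbDisconnect(ch)
dbUnloadDriver(drv)
```

Better still, if you only need summaries per group, let MySQL do the
aggregation (e.g. select `group`, avg(measurement) from mydata group
by `group`) so that only the summary table ever reaches R.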

regards, christian


Christoph Lehmann wrote:

> Hi
> I just started with RMySQL. I have a database with roughly 12 million 
> rows/records and 8 columns/fields.
>
> From all 12 million records I want to import only 3 fields.
> The fields are specified as: id int(11), group char(15), measurement 
> float(4,2).
> Why does this take > 1 GB of RAM? I run R on SuSE Linux with 1 GB of 
> RAM, and with the code below it even fills the whole 1 GB of swap. I 
> just don't understand how 12e6 * 3 values can fill such a huge amount 
> of RAM. Thanks for clarification and potential solutions.
>
>
> ## my code
> library(RMySQL)
> drv <- dbDriver("MySQL")
> ch <- dbConnect(drv,dbname="testdb",
>                 user="root",password="mysql")
> testdb <- dbGetQuery(ch,
>        "select id, `group`, measurement from mydata")
> dbDisconnect(ch)
> dbUnloadDriver(drv)
>
> ## end of my code
>
> Cheers
> Christoph
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! 
> http://www.R-project.org/posting-guide.html
>



