[R] Re: Memory in R.

R RR rstat.diallo at gmail.com
Wed Apr 2 16:46:49 CEST 2008


Dear R users,
Many thanks for your answers.
I've made much progress since my last posting.

I now have the following problem. I've run the GAM model

library(mgcv)   # assumption: the post does not say which package provides gam();
                # both mgcv and the gam package export a gam() function

mygam <- gam(Y ~ factor(year)
             + m1.q02 + m1.q05y + m1.q05y2 + m1.q06
             + m4b.q05 + m4b.q052 + m5a.q01
             + depratio + depratio2 + residence10y + urbrur
             + factor(prefect)
             + m1.q02_ps + m1.q05y_ps + m1.q05y2_ps + m1.q06_ps
             + m4b.q05_ps + m4b.q052_ps + m5a.q01_ps
             + depratio_ps + depratio2_ps + residence10y_ps + urbrur_ps
             + factor(hhid),
             data = cwp2)

and received the following error message:

Erreur : impossible d'allouer un vecteur de taille 236.3 Mo
(in English: cannot allocate vector of size 236.3 Mb).

I have 7237 observations in my data.
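
For scale, the failed allocation is roughly the size of the dense model
matrix that gam() must build, because factor(hhid) expands into one dummy
column per household. A back-of-envelope check (the ~4000 hhid levels are
an assumption for illustration; the post only gives the 7237 observations):

n <- 7237            # observations, stated above
p <- 4000            # assumed number of hhid levels, plus a few more
                     # columns for the remaining terms
n * p * 8 / 2^20     # doubles take 8 bytes each: ~221 MB,
                     # close to the 236.3 MB R tried to allocate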

Is there any way to increase the memory available to R so that the model fits?
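
For reference, on Windows (where this error commonly appears) the limit can
be queried and raised from within R. A minimal sketch, assuming a 2008-era
32-bit build whose address space tops out around 2 GB:

memory.limit()              # current limit in MB (Windows only)
memory.limit(size = 2047)   # raise it towards the 32-bit ceiling
# equivalently, start R with:  Rgui.exe --max-mem-size=2047M
gc()                        # release unreferenced objects before refitting

On Linux or Mac OS X there is no such switch; R simply takes whatever memory
the operating system allows the process.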

Best regards.

Amadou DIALLO



2008/4/1, Liviu Andronic <landronimirc at gmail.com>:
> Hello Amadou,
>
> On Mon, Mar 31, 2008 at 4:53 PM, R RR <rstat.diallo at gmail.com> wrote:
> >  1 - I want to increase memory, but I can't find
> >  the right way; e.g. in Stata you just type "set memory 1g".
> >  When I try to load huge datasets, R crashes.
>
> You will find this recent thread [1] interesting. You may also want
> to check the packages filehash, ff and sqldf.
> Regards,
> Liviu
>
> [1] http://www.nabble.com/How-to-read-HUGE-data-sets--tt15729830.html
>
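
To illustrate the quoted suggestion: two of the packages named keep the bulk
of the data out of RAM. A sketch assuming a hypothetical file cwp2.csv,
reasonably recent versions of both packages, and an illustrative row filter:

library(ff)
cwp2 <- read.csv.ffdf(file = "cwp2.csv")   # columns stored on disk, not in RAM

library(sqldf)                             # filter rows before they reach R;
                                           # read.csv.sql names the table "file"
sub <- read.csv.sql("cwp2.csv",
                    sql = "select * from file where year >= 2000")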


