[R] Error: cannot allocate vector of size 216.0 Mb

José Augusto Jr. jamajbr at gmail.com
Mon Jul 21 13:25:40 CEST 2008


Dear all,

Thank you for your attention.

1) I'm using a Core 2 Duo CPU with 2 GB of physical memory and Windows Vista.
2) The main function causing the error is embedd(x = data, d, t).
3) The time series I'm using has 1,000,000 observations of real numbers (see the size check below).
4) Sometimes the function works, sometimes it doesn't.
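
For scale, a quick back-of-the-envelope check (assuming, as the error size
suggests, that embedd() materializes the full delay-embedding matrix as
doubles):

    N <- 1e6   # observations in the series
    m <- 28    # embedding dimension roughly implied by the error message
    N * m * 8 / 2^20   # ~213.6 Mb, close to the 216.0 Mb R failed to allocate

R needs that much contiguous address space in a single allocation, which is
why the call can fail intermittently once a 32-bit session is fragmented.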

Some things I'm doing:

1) I have put x <- NULL and gc() at the end of each memory-intensive
routine (the embedding) to release memory; a minimal sketch follows this list.

2) I installed Ubuntu Linux on the same machine and will try the same routine.
"Effective men use Unix" :)

And my plans for the future:

3) If that doesn't work, I will try to rewrite the code to reduce memory
requirements (see the sketch after this list).

4) If that doesn't work, I will try to parallelize the code using snow
or Rmpi, or something along those lines.

5) If that doesn't work, I will try to use a cluster with enough
memory for the job.
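
For item 3, a hedged sketch of one possible low-memory rewrite: rather than
materializing the whole N x m embedding at once, build it in blocks of rows
and reduce each block immediately. Here 'f' is a hypothetical per-block
function, and the indexing convention (row i is x[i], x[i+d], ...,
x[i+(m-1)d]) is an assumption about what embedd() computes:

    embed_blocks <- function(x, m, d, block = 1e5, f) {
      n <- length(x) - (m - 1) * d          # number of embedded points
      starts <- seq(1, n, by = block)
      out <- vector("list", length(starts))
      for (k in seq_along(starts)) {
        idx <- starts[k]:min(starts[k] + block - 1, n)
        # build only 'block' rows of the delay embedding at a time
        E <- sapply(0:(m - 1), function(j) x[idx + j * d])
        out[[k]] <- f(E)
        rm(E)                               # release the block before the next
      }
      out
    }

The same block structure would also suit item 4: the blocks are independent,
so with snow they could be farmed out along the lines of
cl <- makeCluster(2); out <- parLapply(cl, block_list, f); stopCluster(cl).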

Any suggestions?

Many thanks.

Regards,

jamaj


2008/7/21, Prof Brian Ripley <ripley at stats.ox.ac.uk>:
> On Mon, 21 Jul 2008, Uwe Ligges wrote:
>
> > Several questions:
> >
> > - Before we go ahead: Are you sure 3 Gb are sufficient for your problem?
> > - Which OS (I guess Windows)?
> >
>
> (The only platform on which these functions are supported.)
>
> > - Which version of R (let's assume R-2.7.1)?
> > - Is your Windows 3GB-enabled in the boot flags, or is it a 64-bit
> > version of Windows?
> >
>
> (No, or the default memory limit would be higher than 1.5Gb.  R by default
> uses as high a memory limit as is sensible if (as here) the address space is
> the limiting factor.)
>
>
> >
> >
> > Best wishes,
> > Uwe Ligges
> >
> >
> >
> > José Augusto Jr. wrote:
> >
> > > Please,
> > >
> > > I have a 2 GB computer and a huge time series to embed, and I tried
> > > increasing memory.limit() and memory.size(max=TRUE), but nothing helped.
> > >
> > > Just before running the command:
> > >
> > >
> > > > memory.size(max=TRUE)
> > > [1] 13.4375
> > >
> > > > memory.limit()
> > > [1] 1535.875
> > >
> > > > gc()
> > >         used (Mb) gc trigger (Mb) max used (Mb)
> > > Ncells 209552  5.6     407500 10.9   350000  9.4
> > > Vcells 125966  1.0     786432  6.0   496686  3.8
> > >
> > >
> > > I increased the memory limit:
> > >
> > >
> > > > memory.limit(3000)
> > > NULL
> > >
> > > > memory.limit()
> > > [1] 3000
> > >
> > > > memory.size()
> > > [1] 11.33070
> > >
> > > > memory.size(max=TRUE)
> > > [1] 13.4375
> > >
> > > > gc()
> > >         used (Mb) gc trigger (Mb) max used (Mb)
> > > Ncells 209552  5.6     407500 10.9   350000  9.4
> > > Vcells 125964  1.0     786432  6.0   496686  3.8
> > >
> > >
> > > And even after increasing the memory limit, I still get the error.
> > >
> > > Any suggestions?
> > >
> > > Thanks in advance.
> > >
> > > jama
> > >
> >
>
> --
> Brian D. Ripley,                  ripley at stats.ox.ac.uk
> Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
> University of Oxford,             Tel:  +44 1865 272861 (self)
> 1 South Parks Road,                     +44 1865 272866 (PA)
> Oxford OX1 3TG, UK                Fax:  +44 1865 272595


