[R] memory
Ferran Carrascosa
ferran.carrascosa at gmail.com
Tue Aug 30 09:39:37 CEST 2005
Thanks, Prof. Brian, for your answers.
I have read about the 'ref' package for more memory-efficient work.
Does anybody know whether this package could help me work with a
700.000 x 10.000 matrix?
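In case it helps to show what I have in mind, here is a minimal, untested
sketch of how I understand ref would be used (I am assuming the
as.ref()/deref() interface from the package documentation; the matrix here
is a small stand-in, not my real data):

    library(ref)
    m  <- matrix(1, nrow = 1000, ncol = 100)  # small stand-in matrix
    rm <- as.ref(m)                           # pass a reference around instead of copies
    double_it <- function(r) {
      deref(r) <- deref(r) * 2                # assign back through the reference, so the
    }                                         # caller's m sees the change without reassigning
    double_it(rm)
    m[1, 1]                                   # 2

As I understand it, this only avoids extra copies when passing the matrix to
functions; it does not by itself lift the limits listed below.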
Even with the ref package, I expect to run into:
- The 2 Gb limit in R for Windows.
- The maximum of about 2*10^9 cells in a single object.
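As a quick size check (just back-of-the-envelope arithmetic, assuming the
matrix is stored as 8-byte doubles):

    cells <- 700000 * 10000   # 7e9 cells
    cells * 8                 # 5.6e10 bytes, roughly 52 Gb of storage
    cells > 2 * 10^9          # TRUE: also over the per-object cell limit

So both limits above are hit long before the whole matrix fits in memory.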
Thanks in advance,
--
Ferran Carrascosa
2005/8/30, Prof Brian Ripley <ripley at stats.ox.ac.uk>:
> On Mon, 29 Aug 2005, Ferran Carrascosa wrote:
>
> > Hi,
> >
> > I have a matrix with 700.000 x 10.000 cells with floating point data.
> > I would like to work with the entire table, but I have a lot of memory
> > problems. I have read the ?memory help page.
> > I work with Win 2000 with R2.1.0
> >
> > The only solution that I have applied is:
> >> memory.limit(size=2048)
> >
> > But now my problems are:
> > - I need to work with more than 2 Gb. How can I exceed this limit?
>
> Re-read the rw-FAQ, or (preferably) get a more capable OS on a 64-bit CPU.
>
> > - When I apply some algorithms, the maximum of about 2*10^9 cells in
> > one object is reached.
>
> You will never get that many cells (that is the address space in bytes,
> and they are several bytes each). Please do as the posting guide asks
> and report accurately what happened.
>
> > Please could you send me some advice/strategies for working with
> > large amounts of data in R?
> >
> > Does R have a way to work with lower memory requirements?
>
> Your matrix has 7e09 cells (assuming you are using . as a thousands
> separator) and needs 5.6e10 bytes to store. Your OS has a memory address
> limit of 3.2e09 bytes. Don't blame R for being limited by your OS.
>
> --
> Brian D. Ripley,                  ripley at stats.ox.ac.uk
> Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
> University of Oxford,             Tel: +44 1865 272861 (self)
> 1 South Parks Road,                    +44 1865 272866 (PA)
> Oxford OX1 3TG, UK                Fax: +44 1865 272595
>