[R] memory
Sean O'Riordain
sean.oriordain at gmail.com
Tue Aug 30 09:56:52 CEST 2005
Ferran,
What are you trying to do with such a large matrix? With 7e9 cells,
even a linear algorithm (which is quite unlikely) will take a "very
long time"(tm). Quickly: at one microsecond per operation (very
optimistic?) and 7e9 operations, that's
> 7e9/1e6/60
[1] 116.6667
minutes...
If we're doing something a little more complicated than linear, say
O(n^2.5) on a square matrix with 7e9 cells (so side n = sqrt(7e9)),
then we're talking
> (7e9^.5)^2.5/1e6/60
[1] 33745.92
minutes...
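Putting both estimates behind a small helper makes it easy to try
other assumptions (a rough sketch; est.minutes is just a name I'm
making up here, with n taken as the side of a square matrix):
> est.minutes <- function(cells, exponent, ops.per.sec = 1e6)
+     (sqrt(cells)^exponent) / ops.per.sec / 60
> est.minutes(7e9, 2)     # linear in the number of cells
[1] 116.6667
> est.minutes(7e9, 2.5)   # O(n^2.5)
[1] 33745.92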
As Brian Ripley said, if you really want to do this then you must use
an operating system that can handle more than 32-bit addressing; one
such would be Linux built for and running on a 64-bit platform, of
which there are a few.
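A quick way to check what you are running: .Machine reports the
pointer size of the current R build (4 bytes on a 32-bit build, 8 on
a 64-bit one), e.g. on 32-bit Windows:
> .Machine$sizeof.pointer
[1] 4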
cheers!
Sean
On 30/08/05, Ferran Carrascosa <ferran.carrascosa at gmail.com> wrote:
> Thanks Prof Brian for your answers,
> I have read about the 'ref' package for more memory-efficient
> handling of objects. Does anybody know whether this package could
> help me to work with a 700.000 x 10.000 matrix?
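>
> The kind of usage I have in mind (just a sketch, assuming the
> as.ref()/deref() interface from the package documentation, with x
> standing for my matrix):
> > library(ref)
> > r <- as.ref(x)    # hand x around by reference instead of copying
> > dim(deref(r))     # operate on the object through the reference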
>
> Even with the ref package I will have problems with:
> - the 2 Gb limit of R for Windows;
> - the maximum of about 2*10^9 cells in one object.
>
> Thanks in advance,
> --
> Ferran Carrascosa
>
>
> 2005/8/30, Prof Brian Ripley <ripley at stats.ox.ac.uk>:
> > On Mon, 29 Aug 2005, Ferran Carrascosa wrote:
> >
> > > Hi,
> > >
> > > I have a 700.000 x 10.000 matrix of floating-point data. I would
> > > like to work with the entire table, but I am having a lot of
> > > memory problems. I have read ?memory.
> > > I work on Windows 2000 with R 2.1.0.
> > >
> > > The only solution that I have applied is:
> > > > memory.limit(size=2048)
> > >
> > > But now my problems are:
> > > - I need to work with more than 2 Gb. How can I exceed this limit?
> >
> > Re-read the rw-FAQ, or (preferably) get a more capable OS on a 64-bit CPU.
> >
> > > - When I apply some algorithms, the maximum of about 2*10^9 cells
> > > in one object is reached.
> >
> > You will never get that many cells (that is the address space in bytes,
> > and they are several bytes each). Please do as the posting guide asks
> > and report accurately what happened.
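> >
> > For scale, that 2*10^9 figure matches the 2 Gb user address space
> > of 32-bit Windows:
> > > 2^31
> > [1] 2147483648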
> >
> > > Please could you send me some advice/strategies for working with
> > > large amounts of data in R?
> > >
> > > Does R have a way to work with less memory?
> >
> > Your matrix has 7e09 cells (assuming you are using . as a thousands
> > separator) and needs 5.6e10 bytes to store. Your OS has a memory address
> > limit of 3.2e09 bytes. Don't blame R for being limited by your OS.
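> >
> > To see where those figures come from:
> > > 7e5 * 1e4          # cells
> > [1] 7e+09
> > > 7e5 * 1e4 * 8      # bytes, at 8 bytes per double
> > [1] 5.6e+10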
> >
> > --
> > Brian D. Ripley,                  ripley at stats.ox.ac.uk
> > Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
> > University of Oxford,             Tel: +44 1865 272861 (self)
> > 1 South Parks Road,                    +44 1865 272866 (PA)
> > Oxford OX1 3TG, UK                Fax: +44 1865 272595
> >
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
>