[R] cannot.allocate.memory.again and 32bit<--->64bit
Prof Brian Ripley
ripley at stats.ox.ac.uk
Tue Nov 15 12:50:21 CET 2005
You need to understand that your process has (normally) 3GB of user
address space, not of memory.
The `gc trigger' is the value at which an automated gc is triggered, but one
is also triggered whenever no block of memory large enough for a requested
allocation is left. So it is unrelated to the message about not being able
to allocate memory. And it does need to be bigger than the maximum actual
use: you have had three copies of your object in use at once (and now have
two).
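You can watch this yourself with gc(). A minimal sketch, mirroring your
example (the names x and m are mine, and gc()'s `reset' argument resets the
"max used" statistics; exact figures will vary by version and platform):

gc(reset = TRUE)              # reset the "max used" statistics
x <- rnorm(100000 * 500)      # one copy, ~381 Mb of doubles
m <- matrix(x, 100000, 500)   # a second copy (plus a transient third)
gc()                          # "used" ~763 Mb; "max used" ~1145 Mb,
                              # i.e. three copies were live at the peak
rm(m); gc()                   # "used" falls back to ~382 Mb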
The problem with a small address space is that it can easily become
fragmented; a 64-bit system avoids that. Generally, to limit fragmentation,
we would not want more than about 1/4 of the address space in use.
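In concrete terms (taking the 3Gb figure above; this is only a rule of
thumb):

3 * 1024^3 / 4 / 1024^2     # ~768 Mb: rough ceiling under the 1/4 guideline
100000 * 500 * 8 / 1024^2   # ~381 Mb: one copy of your object

So two simultaneous copies of your object already reach that ceiling, and
the three you had at the peak exceed it.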
BTW, setting the dim on your vector is a much more efficient way to do this:
> my.vector<-rnorm(100000*500)
> dim(my.vector) <- c(100000,500)
> gc()
           used  (Mb) gc trigger  (Mb) max used  (Mb)
Ncells   169775   4.6     350000   9.4   350000   9.4
Vcells 50063257 382.0   50415499 384.7 50064028 382.0
so perhaps it is time for you to learn some of these tricks. If you
study matrix() you will see where the extra copy comes from. (Hint: it is
more-or-less needed if byrow=TRUE.)
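To see why setting dim is so cheap: a matrix is just a vector carrying a
`dim' attribute, so no data need to move. A small sketch in standard base R
(which also answers your question below about a matrix being allocated as a
vector):

x <- rnorm(6)
dim(x) <- c(2, 3)   # now a 2x3 matrix; no copy of the data is made here
is.matrix(x)        # TRUE
attributes(x)       # $dim: [1] 2 3
dim(x) <- NULL      # strip the dims and x is a plain vector again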
On Tue, 15 Nov 2005, someone with a broken shift key wrote:
> hello!
> ------
>
> i use a 32-bit Linux (SuSE) server, so i am limited to 3.5Gb of memory.
> i can demonstrate that, from time to time, there is a problem with
> allocating objects of large size, for example:
>
>
> 0. state (no objects created yet)
> ------------------------------------
>
>> gc()
>          used (Mb) gc trigger (Mb) max used (Mb)
> Ncells 162070  4.4     350000  9.4   350000  9.4
> Vcells  59921  0.5     786432  6.0   281974  2.2
>
>
>
> 1. state: now let us create a vector of large size
> --------------------------------------------------
>
>> my.vector<-rnorm(100000*500)
>> object.size(my.vector)/1024^2
> [1] 381.4698
>> 100000*500*8/1024^2 #calculate object.size directly
> [1] 381.4697
>> gc()
>            used  (Mb) gc trigger  (Mb) max used  (Mb)
> Ncells   162257   4.4     350000   9.4   350000   9.4
> Vcells 50060239 382.0   50412232 384.7 50060419 382.0
>
>
>
> 2. state: well, let us now create a matrix of the same size from this vector
> --------------------------------------------------------------------------
>
>> my.matrix<-matrix(my.vector,nrow=100000,ncol=500)
>> gc()
>             used   (Mb) gc trigger   (Mb)  max used   (Mb)
> Ncells    162264    4.4     350000    9.4    350000    9.4
> Vcells 100060241  763.4  150315042 1146.9 150060261 1144.9
>> object.size(my.matrix)/1024^2 #object.size of the matrix, in Mb
> [1] 381.4698
>
>
> so, according to the used (Mb) column, the matrix actually needs the same
> number of Mb as the vector.
> but the gc trigger (Mb) value (i still have problems understanding this)
> grows enormously.
> and i am sure i received the "cannot allocate the vector of xxxKb" error
> the last time i tried the same experiment.
>
> if we know that a matrix (or an array generally) is actually allocated as
> a vector (just with dimensions added), why do we need so much trigger
> memory for it?
>
> is this a problem for R only on 32-bit? what is the difference with
> respect to the trigger memory if i use 64-bit (i have not tried yet)?
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595