[R] cannot.allocate.memory.again and 32bit<--->64bit
Pavel Khomski
pkhomski at wiwi.uni-bielefeld.de
Tue Nov 15 11:00:11 CET 2005
hello!
------
I use a 32-bit Linux (SuSE) server, so I am limited to about 3.5 Gb of
memory.
I can demonstrate that from time to time there is a problem with
allocating objects of large size, for example:
0.state (no objects yet created)
------------------------------------
> gc()
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 162070 4.4 350000 9.4 350000 9.4
Vcells 59921 0.5 786432 6.0 281974 2.2
1.state: let us now create a vector of large size
--------------------------------------------------
> my.vector<-rnorm(100000*500)
> object.size(my.vector)/1024^2
[1] 381.4698
> 100000*500*8/1024^2 #calculate object.size directly
[1] 381.4697
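The same arithmetic can be checked at any size; a minimal sketch on a smaller vector (assuming 8 bytes per element of a double vector, which is what the direct calculation relies on; the variable names are illustrative):

```r
n <- 1000 * 50              # a smaller example than 100000*500
v <- rnorm(n)
object.size(v) / 1024^2     # reported size in Mb (plus a small object header)
n * 8 / 1024^2              # direct calculation: 8 bytes per double element
```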
> gc()
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 162257 4.4 350000 9.4 350000 9.4
Vcells 50060239 382.0 50412232 384.7 50060419 382.0
2.state: well, let us create a matrix of the same size from this vector
--------------------------------------------------------------------------
> my.matrix<-matrix(my.vector,nrow=100000,ncol=500)
> gc()
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 162264 4.4 350000 9.4 350000 9.4
Vcells 100060241 763.4 150315042 1146.9 150060261 1144.9
> object.size(my.matrix)/1024^2 # same size as the vector
[1] 381.4698
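The jump of used (Mb) from 382 to 763 in the gc() output above comes from matrix() making a copy of its data argument, so vector and matrix coexist. A minimal sketch of an alternative, assuming the goal is only to reshape the existing vector: assigning to dim() turns the vector itself into a matrix without allocating a second copy (shown here at a tenth of the original size to keep it quick):

```r
# 10000 x 500 doubles, ~38 Mb (the post uses 100000 x 500, ~381 Mb)
v <- rnorm(10000 * 500)
# matrix(v, nrow = 10000, ncol = 500) would allocate a second copy
# of the data; assigning to dim() reshapes v in place instead:
dim(v) <- c(10000, 500)
is.matrix(v)        # TRUE: v is now a 10000 x 500 matrix
```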
so, the matrix actually, according to used (Mb), needs the same Mb as
the vector.
but the trigger (Mb), and I still have problems understanding this,
grows enormously.
and I am sure I received the "cannot allocate vector of size xxx
Kb" error the last time I tried the same experiment.
if we know that a matrix (or an array generally) is actually allocated
as a vector (with a dim attribute added), why do we need so much
trigger memory for it?
is this a problem for R only on 32-bit? what is the difference with
respect to the trigger memory if I use 64-bit (I have not tried yet)?
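A quick way to check which kind of build is running (a sketch; `.Machine$sizeof.pointer` is 4 bytes on a 32-bit build and 8 on a 64-bit one, and only the 64-bit build lifts the roughly 3-4 Gb per-process address-space ceiling behind the "cannot allocate vector" errors):

```r
.Machine$sizeof.pointer   # 4 = 32-bit build, 8 = 64-bit build
R.version$arch            # e.g. "i686" versus "x86_64"
```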
thanks for your advice
--------------------------
More information about the R-help mailing list