[R] memory increase for large R simulation

Christoph Lehmann christoph.lehmann at gmx.ch
Tue Sep 28 10:13:13 CEST 2004


I don't know your details, but here are some remarks which might help 
with memory problems under Windows:

(1) Consider e.g. Linux, which is better able to manage 2 GB of RAM.

(2) Look at gc() (the garbage collector): call it after an rm() command 
to really free the memory.
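
For example, a minimal sketch (big.obj is just a placeholder name for a 
large object in your workspace):

big.obj <- matrix(0, nrow = 5000, ncol = 5000)  # roughly 200 MB of doubles
rm(big.obj)                                     # remove the object from the workspace
gc()                                            # run the garbage collector so the memory is really released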

(3) Allocate memory for arrays this way:
x <- rep(0, 64 * 64 * 16 * 1000)
dim(x) <- c(64, 64, 16, 1000)
and call gc() after this definition, since even this way twice the 
memory needed for x is temporarily allocated.
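
For illustration, continuing the example above (object.size() simply 
reports how much memory x itself occupies):

gc()             # release the temporary copy made during the allocation
object.size(x)   # about 500 MB for the 64 x 64 x 16 x 1000 numeric array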

Cheers

Christoph

shane stanton wrote:
> Hi,
> 
> I am running R from windows 95. I have a large
> simulation, which R does not get very far through
> before telling me:
> 
> "error  cannot allocate vector of size 871875 Kb"
> 
> and warning message "reached total allocation of 127
> Mb"
> 
> I have been trying to increase the memory allocation
> to R from my computer, using various commands at the R
> prompt, such as memory.limit(size=......) to which R
> responds "NULL" or "cannot decrease memory limit" (no
> matter how large I try to make the argument of
> memory.limit).
> 
> Does anybody have any ideas re how I can get this
> simulation to run?
> 
> Many thanks,
> 
> Shane Stanton
> 
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
> 
>



