[R] allocating vector memory > 1 GByte on Windows XP / Vista / 7

Duncan Murdoch murdoch at stats.uwo.ca
Mon Nov 30 20:25:56 CET 2009


In answer to your 4th question: you are trying to allocate an object 
whose size is about a quarter of the addressable memory on your 
machine. You might be lucky and have that much available in one 
contiguous piece, but I'm guessing you don't, which is why the 
allocation fails. So don't worry about questions 1, 2, or 3; switch to 
a 64-bit build of R on a 64-bit OS, where your allocation is a tiny 
fraction of the addressable memory.
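
A minimal sketch of that arithmetic, assuming the vector holds 8-byte 
doubles (the try() call is only illustrative):

    n <- ceiling(1.1 * 1024^3 / 8)  # elements in a 1.1 Gb double vector
    n                               # roughly 148 million
    ## A 32-bit process can address at most 4 Gb, and this one object
    ## needs about a quarter of that, as a single contiguous block.
    x <- try(numeric(n))            # typically fails on 32-bit Windows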

I don't know the answer to 5.
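
A quick way to check which build a given session is running (a small 
sketch; these fields exist in any recent R):

    .Machine$sizeof.pointer  # 4 on a 32-bit build, 8 on a 64-bit build
    R.version$arch           # e.g. "i386" versus "x86_64"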

Duncan Murdoch

On 30/11/2009 12:34 PM, Jochen Albrecht wrote:
> Let me begin by stating that I have read all the help files and FAQs 
> on the subject (there are no more than about a dozen), but I either 
> did not find solutions or found that they did not work.
> Here is the issue. I am trying to run a spatial regression on a 
> medium-sized dataset. Some of the functions in the spdep package that 
> I use require me to allocate a vector of 1.1 Gb (mine is not a 
> spatial SIG question; I am using this just as an example). I have 
> been trying my luck on two different machines, each with 4 GB of 
> physical RAM plus a hundred GB of swap space. One machine runs XP, 
> the other Windows 7.
> I have tried to work with the startup parameters --min-vsize, 
> --min-nsize, --max-vsize, and --max-nsize. The limit here seems to be 
> 2048M; I get error messages if I try to specify more. I can increase 
> the overall memory allocation using memory.limit(3072), and gc() then 
> reports that I have 3 GB available, although I don't know how this is 
> split into cons cells and heap cells.
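
For reference, a minimal sketch of the calls involved (Windows-only; 
3072 is the value quoted above):

    memory.limit()      # report the current overall cap, in Mb
    memory.limit(3072)  # raise the cap to 3 Gb
    gc()                # rows show the split: Ncells = cons cells,
                        # Vcells = vector heap (cells and Mb used,
                        # plus the current gc trigger)
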
> In any case, I still do not get to allocate my 1.1 GB, and I am at a 
> loss to explain why.
> The help page for Memory also does not explain what actually happens 
> with the command-line options --min-vsize and --max-vsize, or at 
> least it did not prepare me for the following odd observation. If I 
> specify --min-vsize=2048M, then I cannot read in more than 384 MB of 
> data; it is as if allocating a minimum of 2 GB of RAM actually 
> reduces the amount available. If I specify --max-vsize=2048M, then I 
> can at least read my data, but I still run out of memory trying to 
> run my spatial regression. In other words, specifying a minimum 
> renders my nice machine unusable, whereas specifying a maximum does 
> not have the desired effect of increasing the memory available to R.
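
A sketch of how these options are passed at startup (values as quoted 
above; sizes take suffixes such as K, M, and G):

    REM from a Windows command prompt; Rterm.exe takes the same flags
    Rgui.exe --min-vsize=2048M
    Rgui.exe --max-vsize=2048M
    REM Windows builds also accept --max-mem-size, the startup
    REM equivalent of memory.limit()
    Rgui.exe --max-mem-size=3000M
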
> So here are my questions:
> 1. What is the purpose of --min-vsize and --min-nsize? (Please do 
> not point me to the help pages; I have read them.)
> 2. Why do I get error messages for --min-nsize > 2G even on machines 
> whose OS supports more? (I tried the BCDEdit switch suggested by 
> Kingsford Jones, but it did not have any visible effect.)
> 3. How can I increase vsize from within R rather than at startup? 
> (memory.limit() does not allow specifying the type of memory.)
> 4. Why do I still get a "cannot allocate vector of size 1.1 Gb" even 
> when gc() tells me that I have used only 240 of my 2048 Mb allocated? 
> (See the sketch after these questions.)
> 5. At this point, I am actually using only a small test dataset to 
> develop my models. Once I have a functioning, well-tested model, I 
> want to apply it to a dataset 100 times larger, and I know that I 
> will have to move to a Linux platform. Given that I won't have 400 
> GB of RAM available there, will I run into similar problems, or will 
> the OS take care of memory allocation using swap space?
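
On question 4: gc() reports what R's heap is currently using, not 
whether the address space still contains a contiguous 1.1 Gb hole. A 
minimal sketch of the Windows-only calls for watching the overall 
footprint:

    memory.size()            # Mb of memory currently in use by R
    memory.size(max = TRUE)  # maximum Mb obtained from the OS so far
    memory.limit()           # the current cap
    ## even with plenty of headroom by these measures, one large
    ## allocation can still fail if the 32-bit address space is
    ## fragmented (by DLLs, earlier allocations, and so on)
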
> Cheers,
>       Jochen
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>



