[R] Error: cannot allocate vector of size... but with a twist

Liaw, Andy andy_liaw at merck.com
Fri Jan 28 14:44:46 CET 2005


Just a couple of remarks below...

> From: James Muller
> 
> Hi,
> 
> I have a memory problem, one which I've seen pop up in the list a few 
> times, but which seems to be a little different. It is the "Error: 
> cannot allocate vector of size x" problem. I'm running R 2.0 on RH9.
> 
> My R program is joining big datasets together, so there are lots of 
> duplicate cases of data in memory. This (and other tasks) prompted me 
> to... expand... my swap partition to 16Gb. I have 0.5Gb of regular, 
> fast DDR. The OS seems to be fine accepting the large amount of 
> memory, and I'm not restricting memory use or vector size in any way.
> 
> R chews up memory up until the 3.5Gb area, then halts. Here's the 
> last bit of output:
> 
>  > # join the data together
>  > cdata01.data <- cbind(c.1,c.2,c.3,c.4,c.5,c.6,c.7,c.8,c.9,c.10,
>  >     c.11,c.12,c.13,c.14,c.15,c.16,c.17,c.18,c.19,c.20,c.21,c.22,
>  >     c.23,c.24,c.25,c.26,c.27,c.28,c.29,c.30,c.31,c.32,c.33)
> Error: cannot allocate vector of size 145 Kb
> Execution halted
> 
> 145--Kb---?? This has me rather lost. Maybe an overflow of some sort? 
> Maybe an OS problem of some sort? I'm scratching my head here.
> 
> Before you question it, there is a legitimate reason for sticking all 
> these components in the one data.frame.

A likely explanation for the "twist": on a 32-bit platform like RH9, a
single process can only address roughly 3Gb no matter how much swap you
add, so once R's total memory use approaches that limit, even a small
145 Kb allocation will fail.

One possible way to get around this, if you really have no alternative, is
to write the individual columns (I assume that's what those things you're
cbind()ing are) out to files, use the Unix `paste' utility to join them
into one file, and read that file into a fresh R session.
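A minimal sketch of that workaround, assuming each column had first been
dumped from R as a one-column text file (the file names here are made up
for illustration):

```shell
# Suppose each column was written out from R with something like:
#   write.table(c.1, "c1.txt", row.names = FALSE, col.names = FALSE)
# Here we fake two such column files:
printf '1\n2\n3\n' > c1.txt
printf '4\n5\n6\n' > c2.txt

# Join the column files side by side with the Unix paste utility:
paste -d' ' c1.txt c2.txt > cdata01.txt
cat cdata01.txt

# In a fresh R session the combined file can then be read back with:
#   cdata01.data <- read.table("cdata01.txt")
```

Because `paste' streams the files line by line, the join never needs
more than a few lines in memory at once.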
 
> One of the problems here is that tinkering is not really feasible. 
> This cbind took 1.5 hrs to finally halt.

That's the price you pay for using your HDD as memory!

Andy
 
> Any help greatly appreciated,
> 
> James




More information about the R-help mailing list