[R] memory problem

Duncan Murdoch murdoch at stats.uwo.ca
Thu Jul 14 14:16:48 CEST 2005


On 7/14/2005 7:19 AM, Ginters wrote:
> I'm a beginner in R, so I don't know how serious my trouble is.
> After running this script:
> 
> t <- c(14598417794, 649693)
> data <- data.frame(read.spss("C:\\Ginters\\Kalibracija\\cal_data.sav"))
> Xs <- as.matrix(data[, 1:2])
> koef <- data.frame(read.spss("C:\\Ginters\\Kalibracija\\f.sav"))
> piks <- as.matrix(koef[, 1])
> g <- regressionestimator(Xs, piks, t)
> 
> I get:
> 
> Error: cannot allocate vector of size 1614604 Kb
> In addition: Warning messages:
> 1: Reached total allocation of 255Mb: see help(memory.size)
> 2: Reached total allocation of 255Mb: see help(memory.size)
> 
> My OS is Windows 2000 Professional.
> The two objects have sizes
> 
> > object.size(Xs)
> [1] 805404
> > object.size(piks)
> [1] 115128
> 
> respectively. The two source files are only 142 KB and 60 KB on disk.
> Why does R need so much memory (1.6 GB)? How can I enlarge the limit?
> Is it possible to move part of the memory onto the hard drive? Or is
> the trouble only with my script?

This sounds like a problem with the regressionestimator function, which 
I think comes from the sampling package.  You'll need to contact the 
maintainer of the package to find out why it needs so much memory, and 
whether there's a way to get the result you want with less.
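
For what it's worth, the size in the error message is consistent with 
the function trying to build an n-by-n matrix, where n is the length of 
piks.  That is only a guess from the arithmetic; the maintainer can 
confirm what is actually allocated:

# A back-of-the-envelope check (a sketch, not from the package source).
# piks holds roughly 115128/8 = 14391 doubles (object.size() includes a
# small header, so the true length is a bit less).
n <- 115128 / 8            # approximate length of piks
n^2 * 8 / 1024             # Kb needed for an n-by-n double matrix
# [1] 1617976  -- essentially the 1614604 Kb the error complains about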
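
As for the limits themselves: on Windows the cap can be inspected and 
raised with memory.limit() (sizes in Mb), or by starting R with the 
--max-mem-size flag.  Note that even with a higher cap, a single 1.6 Gb 
vector may not fit in a 32-bit address space, so getting the function 
to use less memory is the better route.  A minimal sketch (the 1800 is 
just an example value):

memory.limit()              # report the current cap, e.g. 255
memory.limit(size = 1800)   # raise it, if the machine has the RAM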

Duncan Murdoch



