[R] Memory problems with mice

Lorenz, Jennifer jlorenz at uni-goettingen.de
Tue Oct 27 13:20:32 CET 2015


Hi everyone,

I am trying to perform multiple imputation with mice on a dataset of about 13,000 observations and 178 variables. I can start an "empty" imputation ("imp_start <- mice(data, maxit = 0)"), but after a few minutes R stops with the following error message:
Error: cannot allocate vector of size 2.0 Gb
In addition: Warning messages:
1: In col(value) :
  Reached total allocation of 8078Mb: see help(memory.size)
2: In col(value) :
  Reached total allocation of 8078Mb: see help(memory.size)
3: In unique.default(x) :
  Reached total allocation of 8078Mb: see help(memory.size)
4: In unique.default(x) :
  Reached total allocation of 8078Mb: see help(memory.size)
5: In unique.default(x) :
  Reached total allocation of 8078Mb: see help(memory.size)
6: In unique.default(x) :
  Reached total allocation of 8078Mb: see help(memory.size)
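
In case it is useful, this is roughly the pattern I am following ("data" is just a placeholder name for my data frame); if the dry run ever completes, I was planning to inspect the setup like this:

library(mice)
# dry run: sets up the imputation methods and predictor matrix without iterating
imp_start <- mice(data, maxit = 0)
# by default every variable is used as a predictor for every other one,
# so the predictor matrix should be 178 x 178
imp_start$method
dim(imp_start$predictorMatrix)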

I am using R (with RStudio) on Windows 7; my computer has 8 GB of RAM in total.
Following advice I found on the internet, I installed the newest 64-bit version of R and tried the following commands:
memory.size(max = F)     # reports how much memory R is currently using (in Mb)
memory.limit(size = NA)  # reports the current memory limit without changing it
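
A setting I have seen suggested, but have not been sure about, is raising the limit explicitly by passing a size in Mb, for example:

# raise the limit to e.g. 16000 Mb (Windows only; the 16000 is just an example,
# and anything beyond the physical 8 GB would be paged to disk, so I am not
# sure it would really help)
memory.limit(size = 16000)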

But I still get the same error. Now I am wondering: is my dataset really too large for mice or for R? Or is there some other setting I could use to give R more memory? I would really appreciate it if anyone who is familiar with this problem could share their insights.
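
Another thing I have seen mentioned (but have not tried yet, so this is only a sketch) is cutting the predictor matrix down with quickpred(), so that each variable is imputed from a smaller set of predictors instead of all 177 others:

library(mice)
# keep a predictor only if its absolute correlation with the target variable
# (or with the target's missingness indicator) is at least mincor;
# 0.2 is just an example threshold
pred <- quickpred(data, mincor = 0.2)
# run the imputation with the reduced predictor matrix
# (m = 5 imputations is only an example)
imp <- mice(data, m = 5, predictorMatrix = pred)

Would something along those lines be a sensible way to reduce the memory use, or is the dataset simply too large?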

Thanks,
Jen


---
Jennifer Lorenz, M.A.
Georg-August-Universität Göttingen
Sozialwissenschaftliche Fakultät
Institut für Erziehungswissenschaft
Lehrstuhl Schulpädagogik / Empirische Schulforschung

e-mail: jlorenz at uni-goettingen.de
phone: 0551-39-21411
address: Waldweg 26, 37073 Göttingen
room: 8.106

