[R] Cannot allocate large vectors (running out of memory?)
Ronnen Levinson
RML27 at cornell.edu
Mon Mar 24 22:39:55 CET 2008
Hi.
As shown in the simplified example below, I'm having trouble allocating
memory for large vectors, even though it would appear that there is more
than enough memory available. That is, even with a memory limit of 1500 MB,
R 2.6.1 (Win) will allocate memory for a first vector of 285 MB, but not for
a second vector of the same size. Forcing garbage collection does not seem
to solve the problem.
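(For reference, the "284.8 Mb" in the error below is just the vector
length times 8 bytes per double:

> n <- 8640 * 4320   # 37,324,800 elements
> n * 8 / 2^20       # size in Mb for a double vector of length n
[1] 284.7656

so each of x and y needs roughly 285 MB of contiguous storage.)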
Can anyone explain why this is happening, and how to fix it?
Thanks,
Ronnen.
P.S. E-mailed CCs of posted replies would be appreciated.
> rm(list=ls(all=TRUE))
> gc()
          used (Mb) gc trigger  (Mb)  max used  (Mb)
Ncells  143465  3.9     350000   9.4    350000   9.4
Vcells   88573  0.7   50380943 384.4 131023877 999.7
> memory.limit()
[1] 1535.875
> n <- 8640 * 4320
> x=rep(1/3, n)
> memory.size()
[1] 578.8543
> gc()
             used  (Mb) gc trigger  (Mb)  max used  (Mb)
Ncells     143471   3.9     350000   9.4    350000   9.4
Vcells   37413375 285.5   78720219 600.6 131023877 999.7
> y=rep(1/7, n)
Error: cannot allocate vector of size 284.8 Mb
> memory.size()
[1] 578.8543
> gc()
             used  (Mb) gc trigger  (Mb)  max used  (Mb)
Ncells     143471   3.9     350000   9.4    350000   9.4
Vcells   37413375 285.5   78720219 600.6 131023877 999.7
> version
               _
platform       i386-pc-mingw32
arch           i386
os             mingw32
system         i386, mingw32
status
major          2
minor          6.1
year           2007
month          11
day            26
svn rev        43537
language       R
version.string R version 2.6.1 (2007-11-26)
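In case it helps with diagnosis, here is a rough probe of the largest
single block R will still hand out at a given point in the session (a
quick sketch; "can.alloc" is just a name I made up, and the trial vector
is discarded immediately):

> can.alloc <- function(mb) {
+     z <- try(double(mb * 2^20 / 8), silent = TRUE)  # mb megabytes of doubles
+     ok <- !inherits(z, "try-error")                 # TRUE if the allocation worked
+     rm(z); gc()                                     # release the trial block
+     ok
+ }
> can.alloc(285)  # FALSE whenever no single 285 MB block remains available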
--
Ronnen Levinson, Ph.D.
scientist, Lawrence Berkeley National Lab