[R] asking for large memory - crash running rterm.exe on windows
Martin Maechler
maechler at stat.math.ethz.ch
Sat May 28 18:38:54 CEST 2016
>>>>> Ben Bolker <bbolker at gmail.com>
>>>>> on Sat, 28 May 2016 15:42:45 +0000 writes:
> Anthony Damico <ajdamico <at> gmail.com> writes:
>>
>> hi, here's a minimal reproducible example that crashes my
>> R 3.3.0 console on a powerful windows server. below the
>> example, i've put the error (not crash) that occurs on R
>> 3.2.3.
>>
>> should this be reported to http://bugs.r-project.org/ or
>> am i doing something silly? thanx
> From the R FAQ (9.1):
> If R executes an illegal instruction, or dies with an
> operating system error message that indicates a problem in
> the program (as opposed to something like “disk full”),
> then it is certainly a bug.
> So you could submit a bug report, *or* open a discussion
> on r-devel at r-project.org (which I'd have said was a more
> appropriate venue for this question in any case) ...
Indeed.
In this case, this is a known problem -- not just of R, but of
many programs you can run:
You are requesting (much) more memory than your computer has
RAM, and in this situation -- depending on the OS --
your computer will either kill R (what you saw), or it will become
very slow while trying to give R all the memory it asked for,
swapping other running / sleeping processes out to disk.
Both outcomes are very unpleasant...
But it is you, the R user, who asked R to allocate an object of
about 41.6 gigabytes (26 * 1.6 GB, see below).
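Where does the 41.6 come from? A quick back-of-the-envelope
(the ~16 bytes per row is my rough estimate for the data.frame
below: 8 for the numeric column plus 8 for the character column's
pointer, overhead ignored):

```r
## rough size estimate for the oversized data.frame
rows  <- 26 * 1e8   # 26 times the rows of the "large" case below
bytes <- rows * 16  # ~8 bytes (numeric) + ~8 bytes (string pointer) per row
bytes / 1e9         # ~ 41.6 GB
```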
As Ben mentioned, this may be worth a discussion on R-devel ...
or you could rather follow up on the existing thread opened by
Marius Hofert three weeks ago, with subject
"[Rd] R process killed when allocating too large matrix (Mac OS X)"
--> https://stat.ethz.ch/pipermail/r-devel/2016-May/072648.html
His simple command to "crash R" was
matrix(0, 1e5, 1e5)
which for some of us gives an error such as
> x <- matrix(0, 1e5,1e5)
Error: cannot allocate vector of size 74.5 Gb
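(The 74.5 Gb follows directly from 8 bytes per double: a
1e5 x 1e5 matrix has 1e10 cells.)

```r
## size of a 1e5 x 1e5 double matrix, in GiB (which R labels "Gb")
1e5 * 1e5 * 8 / 2^30   # ~ 74.5
```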
but for others it had the same effect as your example.
BTW: I repeat it here in a functionalized form, with added
comments that make apparent what's going on:
## Make a simple data.frame
mkDf <- function(grpsize, wrongSize = FALSE) {
    ne <- (if(wrongSize) 26 else 1) * grpsize
    data.frame(x = rep(LETTERS, each = ne),
               v = runif(grpsize * 26), stringsAsFactors = FALSE)
}
g1 <- ceiling(10^5/26)
d1 <- mkDf(g1) # works fine
str(d1)
## 'data.frame': 100022 obs. of 2 variables:
dP <- mkDf(g1, wrongSize = TRUE) # mis-matching the number of elements
str(dP) # is 26 times larger
## 'data.frame': 2600572 obs. of 2 variables: .....
# make this much bigger
gLarge <- ceiling(10^8/26)
dL <- mkDf(gLarge) # works "fine" .. (well, takes time!!)
str(dL)
## 'data.frame': 100000004 obs. of 2 variables:
as.numeric(print(object.size(dL)) / 1e6)
## 1600002088 bytes
## [1] 1600.002  MBytes, i.e., 1.6 GBytes
## Well, this will be 26 times larger than already large ==> your R may crash *OR*
## your computer may basically slow down to a crawl, when R requests all its memory...
if(FALSE) ## ==> do *NOT* evaluate the following lightly !!
dLL <- mkDf(gLarge, wrongSize = TRUE)
# CONSOLE CRASH WITHOUT EXPLANATION
# C:\Users\AnthonyD>
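One could guard against this kind of surprise by estimating the
size before building the object. A sketch of my own (the name
safeMkDf, the 4 GB default threshold, and the ~16 bytes/row
estimate are all assumptions, not part of the example above):

```r
## Sketch: refuse to build the data.frame if the estimated size is
## too large. Wraps the mkDf() defined above; ~16 bytes per row is
## a rough estimate (numeric column + character-column pointer).
safeMkDf <- function(grpsize, wrongSize = FALSE, maxGB = 4) {
    rows  <- 26 * (if(wrongSize) 26 else 1) * grpsize
    estGB <- rows * 16 / 1e9
    if (estGB > maxGB)
        stop("estimated size ", round(estGB, 1),
             " GB exceeds limit of ", maxGB, " GB")
    mkDf(grpsize, wrongSize)
}
```

With such a guard, the oversized call stops with an informative
error instead of bringing down the R process.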