[Rd] Great overhead for setTimeLimit?
Jiefei Wang
@zwj|08 @end|ng |rom gm@||@com
Mon Dec 6 18:23:26 CET 2021
Hi all,
From the documentation of 'setTimeLimit', it states "Setting any limit has
a small overhead – well under 1% on the systems measured.", but in my
benchmark enabling the time limit makes the code roughly twice as slow
as running without a limit.
Below is an example:
```
benchFunc <- function(x, data) {
  value <- 0
  for (i in 1:5000) {
    for (j in seq_along(data))
      value <- value + data[j]
  }
  value
}
data <- sample(1:10, 10)
setTimeLimit(Inf, Inf, FALSE)
system.time(lapply(1:5000, benchFunc, data = data))
setTimeLimit(999, 999, FALSE)
system.time(lapply(1:5000, benchFunc, data = data))
```
Here are the test results:
```
> setTimeLimit(Inf, Inf, FALSE)
> system.time(lapply(1:5000, benchFunc, data = data))
   user  system elapsed
 10.809   0.006  10.812
> setTimeLimit(999, 999, FALSE)
> system.time(lapply(1:5000, benchFunc, data = data))
   user  system elapsed
 13.634   6.478  20.106
```
As a side note, the GC appears to consume most of the extra CPU time:
it costs about 10 secs without the time limit, but about 19 secs with
the limit.
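For reference, here is a minimal sketch of how the GC time can be measured around each run, using base R's `gc.time()` (which reports cumulative GC CPU time, so differences must be taken); it assumes `benchFunc` and `data` from the example above:

```r
# Difference gc.time() around each benchmark run to attribute
# the slowdown to GC (gc.time() is cumulative, so subtract).
setTimeLimit(Inf, Inf, FALSE)
g0 <- gc.time()
system.time(lapply(1:5000, benchFunc, data = data))
gc.time() - g0   # GC time without a limit

setTimeLimit(999, 999, FALSE)
g1 <- gc.time()
system.time(lapply(1:5000, benchFunc, data = data))
gc.time() - g1   # GC time with the limit active
setTimeLimit(Inf, Inf, FALSE)  # restore the default (no limit)
```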
Any thoughts?
Best,
Jiefei