[R] [External] Somewhat disconcerting behavior of seq.int()

Stephanie Evert
Tue May 3 09:25:46 CEST 2022



> On 3 May 2022, at 07:08, Bert Gunter <bgunter.4567 using gmail.com> wrote:
> 
>> microbenchmark( v1 <- s1 %% 2, times = 50) ## floating point
> Unit: milliseconds
>        expr      min       lq    mean   median       uq      max neval
> v1 <- s1%%2 69.28204 69.60496 69.8957 69.81379 70.01729 71.36125    50
> 
>> microbenchmark( v2 <- s2 %% 2L, times = 50)  ## integer
> Unit: microseconds
>         expr     min      lq     mean   median      uq     max neval
> v2 <- s2%%2L 166.626 167.042 172.7431 170.5215 177.667 194.334    50
> 
> I have no idea why there is such a big difference, but I am pretty sure it's
> way beyond me. Maybe the Mac gurus can figure it out. I may post this on
> r-sig-mac to see.
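
(For anyone trying to reproduce this: the definitions of s1 and s2 are not quoted above. Presumably s1 is a double vector and s2 an integer vector; a minimal sketch of a comparable setup, with the length being my guess, would be:)

  library(microbenchmark)

  n  <- 1e6                    ## guessed length; not shown in the quoted message
  s1 <- as.double(seq_len(n))  ## double-precision vector
  s2 <- seq_len(n)             ## integer vector with the same values
  typeof(s1)                   ## "double"
  typeof(s2)                   ## "integer"

  microbenchmark(v1 <- s1 %% 2,  times = 50)  ## double %% double
  microbenchmark(v2 <- s2 %% 2L, times = 50)  ## integer %% integer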

Very likely this is some inefficiency of the Intel emulator on your M1 Mac. I can imagine it has to do with the substantial differences between the Intel and Arm floating-point architectures.

Why not try with a native M1 version of R?
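
A quick way to check which build is actually running (just a sketch; the exact strings differ between R versions, but an Intel build under Rosetta reports an x86_64 platform even on M1 hardware):

  R.version$arch           ## "aarch64" for a native Apple Silicon build, "x86_64" for an Intel build
  R.version$platform       ## e.g. "aarch64-apple-darwin20" vs. "x86_64-apple-darwin17.0"
  Sys.info()[["machine"]]  ## architecture as seen by the (possibly emulated) R process

If that reports x86_64, installing the arm64 build of R from CRAN and re-running the two microbenchmark() calls should show whether the emulator is responsible.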

Best,
Stephanie

