[R-SIG-Mac] Compiler options for R binary

David Winsemius dwinsemius at comcast.net
Thu Nov 20 22:19:45 CET 2014


On Nov 20, 2014, at 8:17 AM, Braun, Michael wrote:

> I run R on a recent Mac Pro (Ivy Bridge architecture), and before that on a 2010 model (Nehalem architecture).  For the last few years I have been installing R by compiling from source, because I noticed in the etc/Makeconf file that the precompiled binary is compiled with the -mtune=core2 option.  I had thought that, since my system uses a processor with a more recent architecture and instruction set, I would be leaving performance on the table by using the binary.
> 
> My self-compiled R has worked well for me, for the most part. But little things sometimes pop up, like difficulty using RStudio, an occasional permissions problem related to the Intel BLAS, and so on.  And there is a time investment in installing R this way.  So even though I want to exploit as much of my desktop's computing power as I can, I am now questioning whether self-compiling R is worth the effort.
> 
> My questions are these:
> 
> 1.  Am I correct that the R binary for Mac is tuned to Core2 architecture?  
> 2.  In theory, should tuning the compiler for Sandy Bridge (SSE4.2, AVX instructions, etc) generate a faster R?
> 3.  Has anyone tested the theory in Item 2?
> 4.  Is the reason for setting -mtune=core2 to support older machines?  If so, are enough people still using pre-Nehalem 64-bit Macs to justify this?

I use an early 2008 Mac Pro (Lion, soon to go to Yosemite; currently with R's SL branch) and a 2009 MacBook Pro (Yosemite). (After consulting Wikipedia's pages on Mac processors, I'm not sure the pre-/post-Nehalem split is clear enough across all platforms for your question to be answered with clarity. It also appears to me that all MacBooks are Core 2 even now, and if so I think you would get a lot of complaints by making them incompatible with the base version of R. If I'm reading those pages correctly, my 15-inch MBP from 2009 is Lynnfield.)
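On 6: if the main goal is faster compiled package code rather than a faster R itself, per-user compiler flags in ~/.R/Makevars apply to everything installed from source, without rebuilding R. A minimal sketch (the flag choices here are illustrative, not a recommendation):

```make
## ~/.R/Makevars -- picked up by R CMD INSTALL and source installs via
## install.packages().  -mtune=native tunes code layout for the build
## machine while staying runnable on older CPUs; -march=native would
## additionally enable the local instruction set (SSE4.2, AVX, ...) but
## produces binaries that may not run elsewhere.
CFLAGS   = -O3 -mtune=native
CXXFLAGS = -O3 -mtune=native
FFLAGS   = -O3 -mtune=native
```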

> 5.  What would trigger a decision to start tuning the R binary for a more advanced processor?
> 6.  What are some other implications of either self-compiling or using the precompiled binary that I might need to consider?  
> 
> tl;dr:  My Mac Pro has an Ivy Bridge processor.  Is it worthwhile to compile R myself, instead of using the binary?
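On 3: I haven't seen published numbers, but it is easy to test on your own workload. A minimal sketch of a timing script (the matrix size and repetition count are arbitrary); run it under both the CRAN binary and your self-compiled R and compare:

```r
## Time a BLAS-heavy operation; build-tuning differences, if any,
## should show up here.
set.seed(1)
n <- 1000L                     # modest size; raise for stabler timings
x <- matrix(rnorm(n * n), n, n)
elapsed <- replicate(5, system.time(crossprod(x))["elapsed"])
cat("median elapsed (s):", median(elapsed), "\n")
```

Note that much of R's hot linear algebra goes through whatever BLAS is linked, so which BLAS the two builds use may matter more than -mtune.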
> 
> Thanks,
> 
> Michael
> 
> 
> --------------------------
> Michael Braun
> Associate Professor of Marketing
> Cox School of Business
> Southern Methodist University
> Dallas, TX 75275
> braunm at smu.edu
> 
> _______________________________________________
> R-SIG-Mac mailing list
> R-SIG-Mac at r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-mac

David Winsemius
Alameda, CA, USA


