[R-sig-ME] Question regarding large data.frame in LMER?

Douglas Bates bates at stat.wisc.edu
Wed Dec 16 17:16:24 CET 2020


A point that gets lost in these discussions is that there is more to the
calculation than solving, e.g., Henderson's mixed-model equations.  The
method that is used in lme4 evaluates a profiled log-likelihood from the
solution to a sparse penalized least squares problem.  This is described in
our 2015 J. Stat. Soft. paper.
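
For reference, the quantity being evaluated is the profiled deviance
(negative twice the profiled log-likelihood), which, in the notation of
that paper (case weights suppressed), is

  d(θ) = log|L(θ)|² + n·[1 + log(2π·r²(θ)/n)]

where L(θ) is the sparse Cholesky factor satisfying
L(θ)L(θ)' = Λ(θ)'Z'ZΛ(θ) + I and r²(θ) is the minimized penalized
residual sum of squares.  The fixed effects β and the scale σ² have been
profiled out, so the optimization is over θ alone.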

One innovation in the MixedModels.jl package is to extend this penalized
least squares problem to a blocked system that incorporates the
fixed-effects model matrix and the response in addition to the
random-effects model matrix.  Again, sparsity in the random-effects model
matrix is exploited, but through known sparsity patterns (diagonal or
block-diagonal) rather than general sparse matrix techniques.  It turns out
that it is not necessary to "solve" any equations when evaluating the
profiled log-likelihood; it is only necessary to update the blocked
Cholesky factor.  (This isn't as big a win as it may seem, because most of
the work in solving for the conditional estimates of the fixed effects and
the conditional modes of the random effects is in updating the Cholesky
factor.)  If you look at the code in the MixedModels.jl package, it is
literally a matter of installing a new value of the covariance parameter
(written θ in both lme4 and MixedModels.jl), updating the Cholesky factor
L, and adding up logarithms of elements on the diagonal.
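
In rough terms, and assuming the function names in the current
MixedModels.jl sources (objective and updateL! are exported; setθ! is an
internal helper, so treat this as a sketch rather than stable API), one
evaluation looks like

  using MixedModels
  # m is a LinearMixedModel; θ is a candidate covariance parameter vector
  MixedModels.setθ!(m, θ)   # install the new value of θ
  updateL!(m)               # update the blocked Cholesky factor L in place
  objective(m)              # profiled -2 log-likelihood, from log|L| and the penalized RSS

and the optimizer simply calls that composition repeatedly.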

With regard to Harold's point about using Microsoft R: in my experience
there is a big gain from using the MKL BLAS on Intel processors relative to
OpenBLAS, which is the default for Julia.  One benchmark I keep running for
myself takes a little over 7 ms per evaluation on my computer with OpenBLAS
and a little over 4 ms per evaluation with MKL.  One thing to note is that,
in this case, the optimum reached is different when using MKL.  When a
multi-threaded BLAS rearranges the order of operations in the numerical
linear algebra, you can get slightly different floating-point answers,
which, in this case, leads to a different optimum.
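
For anyone who wants to try that comparison themselves, a minimal sketch
(assuming a recent Julia in which the MKL.jl package swaps the BLAS backend
at load time -- the mechanism was different in older versions -- and using
BenchmarkTools for the timing; setθ! is internal, as noted above):

  using MKL                  # must be loaded before packages that use BLAS
  using LinearAlgebra, MixedModels, BenchmarkTools
  BLAS.get_config()          # confirm which backend is active (Julia >= 1.7)
  # m is a fitted LinearMixedModel; time one objective evaluation at its current θ
  @btime objective(updateL!(MixedModels.setθ!($m, $(m.θ))))

Omit the `using MKL` line to benchmark against the OpenBLAS default.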

I believe that R still ships with the reference single-threaded BLAS, so
the difference from switching to MKL could be even more dramatic on modern
multi-core processors.

The point that I am trying to make here is that numerical methods should be
judged first on the accuracy and reliability of the method and then on the
speed with which the estimates are obtained, not just on the speed of
solving one system of equations.  One difference between generalized least
squares (Henderson's mixed-model equations) and the penalized least squares
approach in lme4/MixedModels is that GLS is usually written in terms of the
precision matrix (the inverse of the covariance matrix) of the random
effects, whereas we formulate the PLS approach in terms of the Cholesky
factor of the covariance matrix.  The covariance matrix of the random
effects can be, and often is, singular at the estimates in
over-parameterized models.  In those cases you can't use the precision
matrix, because it doesn't exist.
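
To make that contrast concrete, here is the comparison in schematic form
(weights omitted; Λ(θ) is the relative covariance factor, so the
random-effects covariance is Σ = σ²Λ(θ)Λ(θ)'):

  PLS (lme4/MixedModels):  (Λ(θ)'Z'ZΛ(θ) + I) u = Λ(θ)'Z'(y − Xβ),
                           well-defined for every value of θ
  Henderson's MME:         the coefficient matrix contains Σ⁻¹ = (σ²Λ(θ)Λ(θ)')⁻¹,
                           which does not exist when Λ(θ) is singular
                           (e.g. a variance component estimated as zero)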


On Tue, Dec 15, 2020 at 5:48 PM David Duffy <
David.Duffy at qimrberghofer.edu.au> wrote:

> I have not done any comparisons versus other programs, but:
>
> https://mran.microsoft.com/package/HMMEsolver
>
> where the algorithm is described in
> Kim (2017) A Fast Algorithm for Solving Henderson's Mixed Model Equation
> https://arxiv.org/pdf/1710.09663



