[R-sig-ME] My lack of responses to questions on the list
bates at stat.wisc.edu
Tue Oct 21 15:38:08 CEST 2008
I do appreciate the questions and the discussions on this email list.
I feel I should explain why I have not been responding rapidly of
late. Martin Maechler and I have been extending the Matrix package by
adding Tim Davis's recently released SuiteSparseQR library, which
provides a rank-revealing QR decomposition for sparse matrices. We are
also preparing the Matrix package for its 1.0 release and for its
inclusion as a recommended package in R-2.9.0. That has required more
modification than we had planned.

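For illustration, here is how a sparse QR can be used from the Matrix
package (a minimal sketch; whether the factorization is backed by
SuiteSparseQR depends on the Matrix version installed):

```r
library(Matrix)

set.seed(101)
## A small sparse matrix with more rows than columns
A <- sparseMatrix(i = c(1, 2, 3, 4, 5, 1, 4),
                  j = c(1, 1, 2, 2, 3, 3, 3),
                  x = rnorm(7), dims = c(5, 3))
y <- rnorm(5)

qrA  <- qr(A)              # sparse QR factorization
beta <- qr.coef(qrA, y)    # least-squares solution via the sparse QR
```

The same `qr`/`qr.coef` interface applies to dense matrices, which
makes it easy to check the sparse result against the dense one.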
I had hoped to be able to use the sparse QR decomposition in
preference to the sparse Cholesky decomposition in the lme4 package.
The approaches are equivalent but in some ways the QR decomposition is
more easily understood. Also, this implementation of the QR can take
advantage of multiple cores in the processor. However, it turns out
that, even with multithreading, for the types of model matrices
encountered in big mixed models the QR is considerably slower than the
Cholesky. This is one of the differences between working on theory
and working on implementations. When working on theory you are done
when you have written out the equations and perhaps worked one simple
example. When working on an implementation you must actually build it
before you can check whether the change is worthwhile.

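The equivalence of the two approaches can be seen in a toy
least-squares problem (base R, dense matrices for simplicity; the
mixed-model computations use sparse, penalized versions of these
steps):

```r
set.seed(42)
X <- matrix(rnorm(100 * 4), 100, 4)
y <- rnorm(100)

## Least-squares solution via the QR decomposition of X
b_qr <- qr.coef(qr(X), y)

## The same solution via the Cholesky factor of X'X (normal equations)
R <- chol(crossprod(X))
b_chol <- backsolve(R, forwardsolve(t(R), crossprod(X, y)))

all.equal(as.vector(b_qr), as.vector(b_chol))
```

The two solutions agree to numerical precision; the practical question
is which factorization is faster on the sparse model matrices that
arise in large mixed models.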
There is a new (and, of course, incomplete at present) version of the
"Computational Methods for Mixed Models" vignette in the current lme4
package. It describes a general approach to mixed models based on
PIRLS, a penalized iteratively-reweighted least squares optimization.
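As a sketch of the kind of step at the heart of PIRLS (illustrative
only, not lme4's internal code): each iteration reweights the problem
and solves a penalized weighted least-squares system. Here an identity
matrix stands in for the penalty term, and a logistic model supplies
the weights:

```r
set.seed(7)
Z <- matrix(rnorm(60 * 3), 60, 3)   # a stand-in model matrix
y <- rbinom(60, 1, 0.5)             # binary response, as in a GLMM
u <- rep(0, 3)                      # current coefficient estimate

## One penalized IRLS step for a logistic model:
## reweight, form a working response, solve penalized normal equations
eta <- drop(Z %*% u)
mu  <- plogis(eta)
w   <- mu * (1 - mu)                # IRLS weights
z   <- eta + (y - mu) / w           # working response
u_new <- solve(crossprod(Z, w * Z) + diag(3),  # + identity penalty
               crossprod(Z, w * z))
```

Iterating this update to convergence is the "penalized iteratively
reweighted least squares" part; the vignette describes how the penalty
is derived from the covariance parameters of the random effects.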
I have a branch of the SVN archive at R-forge called "allcoef" where I
am reworking the implementation. (Don't try to install it in its
current form; it's all in pieces right now, although I do expect to be
able to reassemble it into a working package.) If you are interested
you can poke around a bit there. The objective is to create a common
core for PIRLS optimization in linear, generalized linear, nonlinear
and generalized nonlinear mixed models. That core can then be
included in classes that implement the covariance factors and provide
for different conditional variances of the response.