[R-sig-ME] Caution - Big changes in lme4 on Saturday
bates at stat.wisc.edu
Thu Jun 19 16:42:19 CEST 2008
As many on the list will have noticed, I suffer from a bad case of
"the best is the enemy of the good", a sort of adult-onset attention
deficit disorder, and have trouble completing projects.
I have been juggling development of the lme4 package under a new
formulation of the computing methods with maintenance of the older
formulation so as not to break some code in other packages. As I am
facing a deadline in preparing the slides for a workshop, on Saturday,
come hell or high water (and right now "high water" is more than a
metaphor around here), I will release the development version of lme4.
There are many aspects of the new version that are superior to the
current CRAN version of lme4.
The underlying algorithms for linear mixed models are more stable; in
particular, singular variance-covariance matrices for the random
effects are handled gracefully. The algorithms are simplified; ECME
iterations and the optional evaluation of the gradient have been
eliminated. The only control option now active is msVerbose and that
can be replaced by the "verbose" argument. There is no
lmer/lmer2 distinction. You can still call lmer2 but the effect is to
turn around and call lmer. You can specify a family argument in a
call to lmer. If the argument is other than the default gaussian
family then glmer gets called transparently.
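As a sketch of the interface described above (using the sleepstudy and cbpp data sets that ship with lme4; treat the exact calls as illustrative of the announced behaviour, not as tested code):

```r
library(lme4)

## Linear mixed model; the "verbose" argument replaces the old
## msVerbose control option
fm1 <- lmer(Reaction ~ Days + (Days | Subject), sleepstudy,
            verbose = TRUE)

## With a non-gaussian family, lmer hands off to glmer transparently
gm1 <- lmer(cbind(incidence, size - incidence) ~ period + (1 | herd),
            family = binomial, data = cbpp)
```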
The algorithms for fitting generalized linear mixed models and
nonlinear mixed models have also been simplified. Direct optimization
of the Laplace approximation is the default (and only) method at
present. Bin Dai is working on adding adaptive Gauss-Hermite
quadrature for models for which it makes sense.
The mcmcsamp function has changed. The good news is that I have
managed to convince myself that the current implementation is a very
good way to do things. The bad news is that the results are not
plug-compatible with the results of the older version. Also, this
version hasn't been tested as much as I would have liked it to be
before a release. I think the basic methodology is sound but I am not
sure at this point if the results are as correct as they could be.
The big change is in the method of sampling from the distribution of
the variance components. I have used an approach in the spirit of Box
and Tiao's discussion of variance components in their book on Bayesian
inference and somewhat in the flavor of the 2008 Gelman et al. paper
in JCGS. I sample from the conditional distribution of a parameter
that takes a value on the entire real line and which coincides with
the variance component on the non-negative values. When this
parameter is negative the variance component is zero. I'm sure there
will be interesting ways of relating the proportion of zeros in the
chain to tests of the variance component being zero versus greater
than zero. I still need to add code for sampling from the conditional
distribution of covariance parameters.
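The sign trick can be illustrated with a tiny base-R sketch (purely illustrative, not lme4 internals): a parameter s takes values on the whole real line, and the variance component equals s where s is non-negative and is zero otherwise.

```r
## Illustrative only -- not lme4 code.  Map a real-line parameter 's'
## onto a variance component that is zero whenever s is negative.
s <- c(-0.3, 0.1, -1.2, 0.8, 0.0)   # hypothetical MCMC draws of s
sigma2 <- pmax(s, 0)                # corresponding variance component
mean(sigma2 == 0)                   # proportion of zeros in the chain: 0.6
```

The proportion of zeros in such a chain is the quantity that could feed into a test of the variance component being zero versus greater than zero.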
Please be cautious about installing the new version if you depend on
using mcmcsamp in its current form. It would be a good idea to keep a
backup copy of the current CRAN version of the lme4 package in case
you find you want to back out the change.
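One way to keep such a backup (paths and file names here are illustrative) is to download the current CRAN source tarball before upgrading, so it can be reinstalled later with install.packages(..., repos = NULL):

```r
## Illustrative: save the current CRAN source tarball of lme4
dir.create("~/lme4-backup", showWarnings = FALSE)
download.packages("lme4", destdir = "~/lme4-backup", type = "source")

## Later, to back out the change, install from the saved tarball
## (substitute the actual file name that was downloaded):
## install.packages("~/lme4-backup/lme4_x.y.z.tar.gz",
##                  repos = NULL, type = "source")
```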
I regret any inconvenience that these changes will cause. I am
convinced, however, that it will be short-term pain for a long-term gain.