[R-sig-ME] lme4 (your singularity check with getME())

Ben Bolker bbolker at gmail.com
Thu Oct 16 22:28:44 CEST 2014


On 14-10-16 09:37 AM, Farrar, David wrote:
> 

[I'm taking the liberty of cc'ing the r-sig-mixed-models list, as this
seems to me to be a question of general interest]

> Ben, I translate your check as "for those variance parameters not
> bounded away from zero, one or more are below 1e-6."  I have no
> grounds to disagree.  I would very much appreciate it if you have a
> moment to give me one or two sentences on why this works, or to point
> me in some useful direction, assuming I know a little about linear
> algebra and optimization, e.g., that conditions for convergence can
> be different in constrained and unconstrained problems (1st- and
> 2nd-order conditions etc.), the use of condition numbers of X'X or
> nonlinear regression analogues, etc.
>
> Regards, David

   For this check, we haven't (yet) even tried to test for convergence
in the singular case -- we're just checking whether we *are* in the
singular case or not.  The checks we have in place aren't necessarily
sensible in the singular case, although I've found surprisingly few
examples so far where the models are singular and we would deem them to
have converged properly but where the convergence tests fail.  In other
words, in most singular cases the surface appears to be flat and
(half-)convex at the optimum, even though I don't know of any
theoretical reason it would have to be.

  It's on our (long) to-do list to try to work out what the proper
convergence tests would be in a singular case.  Since our singularities
are always simple (i.e., we've hit the box constraint x_i = 0 for one or
more parameters independently), we might get away with something as
simple as dropping the singular dimensions out of the gradient and
Hessian, rather than doing something more complicated with Lagrange
multipliers.
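
  For concreteness, here is a minimal sketch of that idea (this is *not*
something implemented in lme4, and the function name is invented; it
also assumes an lmer() fit, where the parameters being optimized are
exactly the theta vector): drop the boundary dimensions, then apply the
usual gradient/Hessian checks to what remains.

library(lme4)
check_nonsingular_dims <- function(fit, tol = 1e-6) {
    theta  <- getME(fit, "theta")
    lwr    <- getME(fit, "lower")
    derivs <- fit@optinfo$derivs           # gradient & Hessian at the optimum
    keep   <- !(lwr == 0 & theta < tol)    # dimensions not on the boundary
    if (!any(keep)) stop("all variance parameters are on the boundary")
    H <- derivs$Hessian[keep, keep, drop = FALSE]
    c(max.abs.grad = max(abs(derivs$gradient[keep])),
      min.Hess.eig = min(eigen(H, only.values = TRUE)$values))
}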

  Last bit of information: the reason we can get away with a simple
criterion for singularity (i.e. our variance-covariance matrix is
positive semidefinite but not positive definite, or equivalently some
eigenvalues are exactly zero), which isn't always easy, is that we
parameterize the variance-covariance matrix in terms of its Cholesky
factors; this means we have a singular variance-covariance matrix if and
only if one or more of the diagonal elements of the Cholesky factor are
equal to zero (we don't let them go negative).  These parameterizations
are nicely explained in Pinheiro and
Bates 1996 _Stat. Comput._  http://dx.doi.org/10.1007/BF00140873 ...
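
  As an illustration (a sketch, with `fit' standing in for any fitted
merMod object), getME() exposes exactly this parameterization, which is
what the one-line check quoted below relies on:

theta <- getME(fit, "theta")   # Cholesky-factor elements, term by term
lwr   <- getME(fit, "lower")   # 0 for diagonal elements, -Inf for off-diagonals
any(theta[lwr == 0] < 1e-6)    # TRUE => a diagonal element has (numerically) hit zero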

> 
> 
> 
> -----Original Message-----
> From: r-sig-mixed-models-bounces at r-project.org
>   [mailto:r-sig-mixed-models-bounces at r-project.org] On Behalf Of Ben Bolker
> Sent: Wednesday, October 15, 2014 3:27 PM
> To: r-sig-mixed-models at r-project.org
> Subject: [R-sig-ME] lme4
> 
> 
>> Dear Ben Bolker,
>> 
>> Thanks for the quick answer. Yes, I admit that I did not state my
>> problem clearly, and I apologize for that. It was just that I came
>> across the same error messages in the Qs & As that I got when my
>> package was updated.  I was using an earlier version of lme4 and did
>> not have any problems with it. For instance, the following code gave
>> me the results below without any error messages (my dependent
>> variable is binary and the fixed factors are categorical):
>> 
>> summary(mod.15 <- glmer(ErrorRate ~ 1 +
>>     cgroup*cgrammaticality*cHeadNoun*cVerbType + (1|itemF) +
>>     (1 + grammaticality*HeadNoun*VerbType | participantF),
>>     data = e3, family = "binomial", na.action = na.exclude))
> 
> Note that this is a very large (15*15) random-effects
> variance-covariance matrix to estimate: I know that this is
> recommended by Barr et al 2013, but see recent discussion on this
> list, e.g. 
> http://article.gmane.org/gmane.comp.lang.r.lme4.devel/12492/
> 
> It would be a good idea to check for a singular fit, i.e.
> 
> t <- getME(mod.15, "theta")
> lwr <- getME(mod.15, "lower")
> any(t[lwr == 0] < 1e-6)
> 
>> Fixed effects:
>>                                             Estimate Std. Error z value Pr(>|z|)
>> (Intercept)                                  -1.9745     0.1274 -15.494  < 2e-16 ***
>> cgroup                                        1.5843     0.1789   8.854  < 2e-16 ***
>> cgrammaticality                               0.5245     0.2182   2.404   0.0162 *
>> cHeadNoun                                    -0.2720     0.1853  -1.468   0.1422
>> cVerbType                                     0.7591     0.2326   3.263   0.0011 **
>> cgroup:cgrammaticality                        1.5796     0.3586   4.404 1.06e-05 ***
>> cgroup:cHeadNoun                              0.0475     0.3537   0.134   0.8932
>> cgrammaticality:cHeadNoun                     0.5368     0.4338   1.237   0.2159
>> cgroup:cVerbType                             -0.2441     0.3472  -0.703   0.4821
>> cgrammaticality:cVerbType                    -0.4861     0.4185  -1.162   0.2454
>> cHeadNoun:cVerbType                          -0.1563     0.3969  -0.394   0.6936
>> cgroup:cgrammaticality:cHeadNoun              0.2659     0.7161   0.371   0.7104
>> cgroup:cgrammaticality:cVerbType             -0.4691     0.6945  -0.675   0.4994
>> cgroup:cHeadNoun:cVerbType                    0.7661     0.6916   1.108   0.2679
>> cgrammaticality:cHeadNoun:cVerbType           0.9104     0.9147   0.995   0.3196
>> cgroup:cgrammaticality:cHeadNoun:cVerbType    3.1326     1.3994   2.239   0.0252 *
>> 
> To me these estimated effects look only very slightly different from
> the ones below (i.e., only a few percent difference in the point
> estimates, always much smaller than the estimated standard errors, and
> no qualitative differences in Z/P values).  Can you specify whether
> there are any differences that particularly concern you?
> 
>> 
>> But as soon as I updated the package to a new version, the same
>> code gave me the following error messages, and some of the
>> calculations do not match those from the earlier version (as shown
>> below). I don't know exactly which version the previous one was,
>> but I guess I was using the 2013 packages.
>> 
>> Warning messages:
>> 1: In commonArgs(par, fn, control, environment()) :
>>   maxfun < 10 * length(par)^2 is not recommended.
>
> Relatively harmless.
>
>> 2: In optwrap(optimizer, devfun, start, rho$lower, control = control, :
>>   convergence code 1 from bobyqa: bobyqa -- maximum number of
>>   function evaluations exceeded
>> 3: In (function (fn, par, lower = rep.int(-Inf, n), upper = rep.int(Inf, :
>>   failure to converge in 10000 evaluations
>> 
> You definitely need to increase the maximum number of function
> evaluations: see ?glmerControl (your model is fitted with glmer),
> specifically the "optCtrl" setting
> (e.g. control=glmerControl(optCtrl=list(maxfun=1e6))).
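>
> For example (just a sketch, reusing the object names from your code
> above; mod.15b is simply a new name for the refitted model):
>
> mod.15b <- update(mod.15,
>                   control = glmerControl(optCtrl = list(maxfun = 1e6)))
> ## then repeat the singularity check on the refitted model
> any(getME(mod.15b, "theta")[getME(mod.15b, "lower") == 0] < 1e-6)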
> 
>> 4: In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv, :
>>   Model failed to converge with max|grad| = 0.0928109
>>   (tol = 0.001, component 28)
>> 5: In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv, :
>>   Model failed to converge: degenerate Hessian with 1 negative
>>   eigenvalues
>> 
> These are convergence *warnings*.  They do not indicate that your fit
> is actually any worse than previously, just that we have increased
> the sensitivity of the tests.  Can you specify what version you are
> using?
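>
> One additional diagnostic you could try (a sketch, following the
> scaled-gradient check described on lme4's ?convergence help page):
> compare the gradient scaled by the curvature of the deviance surface,
> rather than the raw gradient, against the tolerance (0.001 by default).
>
> dd <- mod.15@optinfo$derivs
> with(dd, max(abs(solve(Hessian, gradient))))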
> 
> I wouldn't recommend moving back to an earlier version of lme4, but
> you could check out
> https://github.com/lme4/lme4/blob/master/README.md for instructions
> on how to install the lme4.0 package if you really want ...
> 
>> Fixed effects:
>>                                             Estimate Std. Error z value Pr(>|z|)
>> (Intercept)                                 -1.97217    0.13810 -14.281  < 2e-16 ***
>> cgroup                                       1.58614    0.18262   8.686  < 2e-16 ***
>> cgrammaticality                              0.52725    0.24544   2.148   0.0317 *
>> cHeadNoun                                   -0.28061    0.21350  -1.314   0.1887
>> cVerbType                                    0.75503    0.25615   2.948   0.0032 **
>> cgroup:cgrammaticality                       1.57010    0.36695   4.279 1.88e-05 ***
>> cgroup:cHeadNoun                             0.05736    0.36138   0.159   0.8739
>> cgrammaticality:cHeadNoun                    0.55238    0.47616   1.160   0.2460
>> cgroup:cVerbType                            -0.24665    0.35618  -0.692   0.4886
>> cgrammaticality:cVerbType                   -0.49272    0.45732  -1.077   0.2813
>> cHeadNoun:cVerbType                         -0.14235    0.44553  -0.319   0.7493
>> cgroup:cgrammaticality:cHeadNoun             0.24468    0.73223   0.334   0.7383
>> cgroup:cgrammaticality:cVerbType            -0.45695    0.70627  -0.647   0.5176
>> cgroup:cHeadNoun:cVerbType                   0.75837    0.70763   1.072   0.2839
>> cgrammaticality:cHeadNoun:cVerbType          0.88375    0.98856   0.894   0.3713
>> cgroup:cgrammaticality:cHeadNoun:cVerbType   3.15344    1.42351   2.215   0.0267 *
>> 
>> Because I am using RStudio I have just two options when I want to
>> install or update packages. When I use CRAN and let RStudio install
>> lme4 automatically, it installs the most recent version, i.e. the
>> new lme4 package, which is problematic as far as I understand
>> (sorry, I might be wrong about that because I don't have any
>> expertise; I'm just speaking from my observations). So my suggestion
>> is to keep the earlier version of lme4 on CRAN, so that when users
>> install the package they automatically get the one that was not
>> problematic.  Another option for me is to download the earlier
>> version and install it from my PC. But when I use this option from
>> RStudio, lme4 does not install and gives the following message.
>> 
>> install.packages("~/lme4_1.0-4.tar.gz", repos = NULL, type =
>> "source") Installing package into
>> 'C:/Users/Azad/Documents/R/win-library/3.0' (as 'lib' is
>> unspecified) * installing *source* package 'lme4' ... ** package
>> 'lme4' successfully unpacked and MD5 sums checked ** libs
> 
> If you want to install 1.0-4 you can either get the tarball from
> here: 
> http://cran.r-project.org/src/contrib/Archive/lme4/lme4_1.0-4.tar.gz
> 
> but you will either need to be able to install it from source (i.e. 
> have compilers etc. installed) or modify the DESCRIPTION file to make
> yourself the maintainer and ship it off to
> ftp://win-builder.r-project.org.
> 
> *OR* (possibly a better idea) you can retrieve a binary/.zip file
> from
> 
> http://lme4.r-forge.r-project.org/repos/bin/windows/contrib/3.0/lme4_1.0-4.zip
>
>  and install it.
> 
> (You didn't specify your actual error messages from the attempted 
> lme4 installation.)
> 
>> Sorry for the inconvenience; I hope that I've made things clear now.
>>
>> Best wishes,
>> Ebrahim
>> 
> 
> _______________________________________________ 
> R-sig-mixed-models at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-sig-mixed-models
>


