[R-sig-ME] logistic regression with glmer, three ways

Malcolm Fairbrother M.Fairbrother at bristol.ac.uk
Fri Nov 27 15:11:13 CET 2015


Thanks to both Thierry and David for their suggestions.

I investigated the issue further, by fitting the same model using MCMCglmm
(which allows for approach A, with cbind, and C, with one Bernoulli trial
per row). The results are below.

The upshot is that the results from both A and C using MCMCglmm are similar
to each other, and to both A and B using glmer (with Laplace). C with glmer
(Laplace) therefore appears to be the exception, and I conclude David is
right: the discrepancy stems from the Laplace approximation.
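For anyone who wants to check David's point directly, here is a minimal,
hypothetical sketch (simulated data, not the obj from this thread)
comparing Laplace with adaptive quadrature in glmer:

```r
library(lme4)

# Simulated stand-in data (not the thread's obj): 30 groups of 10
# Bernoulli trials each, with one random intercept.
set.seed(1)
n_grp <- 30; n_per <- 10
wi <- factor(rep(seq_len(n_grp), each = n_per))
xM <- rnorm(n_grp * n_per)
u  <- rnorm(n_grp, sd = 1)                    # group random intercepts
eta <- 1 - 0.25 * xM + u[as.integer(wi)]
d <- data.frame(y = rbinom(n_grp * n_per, 1, plogis(eta)), xM, wi)

# nAGQ = 1 is the Laplace approximation; nAGQ > 1 uses adaptive
# Gauss-Hermite quadrature, but requires a single scalar random effect
# (which is why David's comparison uses only (1 | wi))
m_laplace <- glmer(y ~ xM + (1 | wi), data = d, family = binomial, nAGQ = 1)
m_agq     <- glmer(y ~ xM + (1 | wi), data = d, family = binomial, nAGQ = 10)

cbind(laplace = coef(summary(m_laplace))[, "Std. Error"],
      agq     = coef(summary(m_agq))[, "Std. Error"])
```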

- Malcolm



library(MCMCglmm)

# parameter-expanded priors for the two random effects (wi and be)
prior <- list(R=list(V=1, nu=0.002), G=list(G1=list(V=1, nu=1, alpha.mu=0,
alpha.V=100), G2=list(V=1, nu=1, alpha.mu=0, alpha.V=100)))
# A: binomial response, successes (X1) and failures (X0) per row
modA.MC <- MCMCglmm(fixed=cbind(X1, X0) ~ xD + xM, random=~wi + be,
data=obj, family="multinomial2", prior=prior)

# with one Bernoulli trial per row the residual variance is not
# identified, so fix it at 1
prior <- list(R=list(V=1, fix=1), G=list(G1=list(V=1, nu=1, alpha.mu=0,
alpha.V=100), G2=list(V=1, nu=1, alpha.mu=0, alpha.V=100)))
# C: one Bernoulli trial per row
modC.MC <- MCMCglmm(fixed=y ~ xD + xM, random=~wi + be, data=objC,
family="categorical", prior=prior)

# note that residual variance is treated differently, so the scales are
# slightly different

> summary(modA.MC)
...
            post.mean l-95% CI u-95% CI eff.samp pMCMC
(Intercept)    0.9501   0.1047   1.8424   1000.0 0.040 *
xD            -0.0916  -0.4815   0.2895   1000.0 0.636
xM            -0.1903  -0.4585   0.1094    901.9 0.210

> summary(modC.MC)
...
            post.mean l-95% CI u-95% CI eff.samp pMCMC
(Intercept)   1.12976  0.05293  2.16752     1000 0.040 *
xD           -0.07997 -0.54759  0.35903     1000 0.732
xM           -0.22518 -0.57009  0.11479     1000 0.192

> summary(modA.MC$Sol)
...
               Mean     SD
(Intercept)  0.9501 0.4554
xD          -0.0916 0.2028
xM          -0.1903 0.1489

> summary(modC.MC$Sol)
...
                Mean     SD
(Intercept)  1.12976 0.5397
xD          -0.07997 0.2315
xM          -0.22518 0.1756
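For completeness, here is one hypothetical way to build the
one-trial-per-row data (the objC format) from the binomial-format data
(the obj format); the toy values below are illustrative, not the thread's
actual data:

```r
# Toy binomial-format data: successes X1 and failures X0 per row,
# with the covariate names used in this thread.
obj_demo <- data.frame(X1 = c(3, 1), X0 = c(1, 2),
                       xD = c(0, 1), xM = c(0.5, -0.2), wi = c("a", "b"))

# Expand each row into X1 + X0 Bernoulli trials (a 0/1 response y),
# repeating the covariates for every trial.
expand_bernoulli <- function(d) {
  idx <- rep(seq_len(nrow(d)), times = d$X1 + d$X0)
  out <- d[idx, setdiff(names(d), c("X1", "X0")), drop = FALSE]
  out$y <- unlist(mapply(function(s, f) c(rep(1, s), rep(0, f)),
                         d$X1, d$X0, SIMPLIFY = FALSE))
  rownames(out) <- NULL
  out
}

objC_demo <- expand_bernoulli(obj_demo)
nrow(objC_demo)   # 7: one row per trial
sum(objC_demo$y)  # 4: matches sum(obj_demo$X1)
```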




On 27 November 2015 at 00:11, David Duffy <David.Duffy at qimrberghofer.edu.au>
wrote:

> On Fri, 27 Nov 2015, Malcolm Fairbrother wrote:
>
> Dear all,
>> I am trying to fit a logistic mixed model using lme4, and finding some
>> results I can't understand.
>>
>
> FWIW, I see similar problems with just (1|wi). glmmML allows only one
> random effect, so it can be run as a comparison:
>
> A. cbind(X1, X0) ~ xD + xM + (1 | wi)
>
>      glmer (laplace)  glmmML (laplace)
> Int  1.2478 (0.4187)  1.2478 (0.4187)
> xD  -0.3722 (0.3619) -0.3722 (0.3619)
> xM  -0.2720 (0.1306) -0.2720 (0.1306)
>
> C. y ~ xD + xM + (1 | wi)
>
>      glmer (laplace)  glmmML (laplace)
> Int  1.2478 (0.2240)  1.2714 (0.4175)
> xD  -0.3722 (0.2842) -0.3894 (0.3608)
> xM  -0.2720 (0.0732) -0.2794 (0.1302)
>
>
> Most importantly, if you use AGQ instead of Laplace
> the glmer SEs now all match.
>
> glmer (nAGQ=10)
>
>            Estimate Std. Error
> (Intercept)  1.2478     0.4196
> xD           -0.3722     0.3627
> xM           -0.2720     0.1309
>
>
> | David Duffy (MBBS PhD)
> | email: David.Duffy at qimrberghofer.edu.au  ph: INT+61+7+3362-0217 fax:
> -0101
> | Genetic Epidemiology, QIMR Berghofer Institute of Medical Research
> | 300 Herston Rd, Brisbane, Queensland 4006, Australia  GPG 4D0B994A
>

