[R-sig-ME] Very different results from lmer and MCMCglmm
ONKELINX, Thierry
Thierry.ONKELINX at inbo.be
Tue Jan 31 20:43:47 CET 2012
Dear Stuart,
A few remarks on the model itself. You are adding three factors (grade.f, subject.f and obsord.f) as both fixed and random effects. That is not a good idea, since the two sets of terms will be competing for exactly the same information. Hence the huge credible intervals in the MCMCglmm model.
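For example, a minimal untested sketch of the kind of specification I mean, with grade.f, subject.f and obsord.f kept as fixed effects only (the object names m1 and glme6 are just placeholders; everything else is taken from your calls):

library(lme4)
## each factor enters the model once: grade.f, subject.f and obsord.f
## as fixed effects, tid and obsid as random intercepts
m1 <- lmer(rating ~ comp.f + grade.f + subject.f + obsord.f +
             (1 | obsid) + (1 | tid),
           data = ratings.prin)

library(MCMCglmm)
## with only two G components left, the prior needs only G1 and G2
glme6 <- MCMCglmm(rating.o ~ comp.f + grade.f + subject.f + obsord.f,
                  random = ~ tid + obsid,
                  family = "ordinal",
                  prior = list(R = list(V = 1, fix = 1),
                               G = list(G1 = list(V = 1, nu = 0),
                                        G2 = list(V = 1, nu = 0))),
                  nitt = 100000,
                  data = ratings.prin)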
I'm a bit surprised by the lmer results as well; I would have expected to see (near) zero variance estimates for those random effects.
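You can check where the information is going by inspecting the posterior traces of the variance components. A quick look, assuming glme5 is the fitted object from your code below:

library(MCMCglmm)
## trace and density plots of the variance components (coda's plot.mcmc);
## the chains for grade.f, subject.f and obsord.f should drift to huge values
plot(glme5$VCV)
## for skewed variance posteriors the mode is more informative than the mean
posterior.mode(glme5$VCV)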
Best regards,
Thierry
________________________________________
From: r-sig-mixed-models-bounces at r-project.org [r-sig-mixed-models-bounces at r-project.org] on behalf of Stuart Luppescu [slu at ccsr.uchicago.edu]
Sent: Tuesday, 31 January 2012 18:51
To: r-sig-mixed-models
Subject: [R-sig-ME] Very different results from lmer and MCMCglmm
Hello, I have a dataset with outcomes {1, 2, 3, 4}. The outcome variable actually consists of ordered categories, but as a point of reference for comparison I analyzed it as numeric in lmer and got these results:
Linear mixed model fit by REML
Formula: rating ~ comp.f + grade.f + subject.f + obsord.f + (1 | obsid)
+ (1 | tid) + (1 | grade.f) + (1 | subject.f) + (1 | obsord.f)
Data: ratings.prin
  AIC  BIC logLik deviance REMLdev
 6886 7058  -3416     6740    6832
Random effects:
 Groups    Name        Variance   Std.Dev.
 tid       (Intercept) 0.19082494 0.436835
 obsid     (Intercept) 0.10405718 0.322579
 subject.f (Intercept) 0.00075553 0.027487
 grade.f   (Intercept) 0.00075435 0.027465
 obsord.f  (Intercept) 0.00060346 0.024565
 Residual              0.24073207 0.490645
Number of obs: 4253, groups: tid, 245; obsid, 94; subject.f, 5; grade.f, 5; obsord.f, 4
Fixed effects:
             Estimate Std. Error t value
(Intercept)  3.261329   0.140592  23.197
comp.f2     -0.095729   0.033461  -2.861
comp.f3     -0.061422   0.033316  -1.844
comp.f4     -0.144613   0.033364  -4.334
comp.f5     -0.059794   0.033599  -1.780
comp.f6     -0.074454   0.033249  -2.239
comp.f7     -0.325454   0.033274  -9.781
comp.f8     -0.186724   0.033187  -5.626
comp.f9     -0.320803   0.033741  -9.508
comp.f10    -0.226328   0.034056  -6.646
grade.f2    -0.203406   0.140249  -1.450
grade.f3    -0.227049   0.134389  -1.689
grade.f4    -0.377642   0.137710  -2.742
grade.f5    -0.225643   0.140196  -1.609
subject.f2  -0.009939   0.053291  -0.187
subject.f3   0.289519   0.061324   4.721
subject.f4  -0.223719   0.107737  -2.077
subject.f5  -0.025963   0.073520  -0.353
obsord.f2    0.004840   0.038436   0.126
obsord.f3    0.112110   0.052707   2.127
obsord.f4    0.156406   0.078614   1.990
These results seem somewhat reasonable to me. But when I analyze the very same dataset using the same model in MCMCglmm, I get very different results:
glme5 <- MCMCglmm(rating.o ~ comp.f + grade.f + subject.f + obsord.f,
                  random = ~ tid + obsid + grade.f + subject.f + obsord.f,
                  family = "ordinal",
                  ## residual variance fixed at 1 (not identified for
                  ## ordinal models); flat nu = 0 priors on the five
                  ## random-effect variances
                  prior = list(R = list(V = 1, fix = 1),
                               G = list(G1 = list(V = 1, nu = 0),
                                        G2 = list(V = 1, nu = 0),
                                        G3 = list(V = 1, nu = 0),
                                        G4 = list(V = 1, nu = 0),
                                        G5 = list(V = 1, nu = 0))),
                  nitt = 100000,
                  data = ratings.prin)
Iterations = 3001:99991
Thinning interval = 10
Sample size = 9700
DIC: 5701.873
G-structure:  ~tid

          post.mean  l-95% CI  u-95% CI eff.samp
tid           2.423     1.821     3.063     2759

~obsid

          post.mean  l-95% CI  u-95% CI eff.samp
obsid         1.521    0.7707     2.331     5227

~grade.f

          post.mean  l-95% CI  u-95% CI eff.samp
grade.f    95365148 2.234e-17 104888830     2296

~subject.f

          post.mean  l-95% CI  u-95% CI eff.samp
subject.f   7.5e+07 1.502e-17 101313849     3950

~obsord.f

          post.mean  l-95% CI  u-95% CI eff.samp
obsord.f  122278523 2.079e-17  64065615     3851

R-structure:  ~units

          post.mean  l-95% CI  u-95% CI eff.samp
units             1         1         1        0
Location effects: rating.o ~ comp.f + grade.f + subject.f + obsord.f

             post.mean   l-95% CI   u-95% CI eff.samp    pMCMC
(Intercept)  1.430e+02 -2.218e+04  1.781e+04    10178 0.607629
comp.f2     -3.448e-01 -5.854e-01 -1.161e-01     6220 0.004124 **
comp.f3     -2.219e-01 -4.527e-01  1.402e-02     6328 0.064124 .
comp.f4     -5.166e-01 -7.459e-01 -2.831e-01     6454 0.000206 ***
comp.f5     -2.087e-01 -4.431e-01  2.333e-02     6338 0.084536 .
comp.f6     -2.692e-01 -5.091e-01 -4.112e-02     6290 0.024948 *
comp.f7     -1.163e+00 -1.403e+00 -9.395e-01     4027  < 1e-04 ***
comp.f8     -6.682e-01 -9.011e-01 -4.368e-01     5448  < 1e-04 ***
comp.f9     -1.157e+00 -1.392e+00 -9.171e-01     4253  < 1e-04 ***
comp.f10    -8.167e-01 -1.056e+00 -5.742e-01     6152  < 1e-04 ***
grade.f2    -2.417e+00 -7.966e+03  8.888e+03    13314 0.396082
grade.f3     1.304e+02 -7.486e+03  9.484e+03    10314 0.342062
grade.f4    -1.684e+02 -9.879e+03  6.926e+03    12352 0.283711
grade.f5     1.218e+02 -8.380e+03  7.895e+03     8740 0.351546
subject.f2  -9.163e+01 -7.562e+03  7.806e+03    12224 0.930309
subject.f3   1.699e+01 -7.411e+03  8.238e+03    12320 0.344536
subject.f4   3.477e+01 -9.427e+03  7.519e+03    13106 0.372165
subject.f5  -1.203e+02 -7.618e+03  8.837e+03     9071 0.848247
obsord.f2   -5.860e+01 -7.058e+03  5.605e+03     9290 0.819794
obsord.f3   -9.302e+01 -5.852e+03  5.641e+03     7386 0.332990
obsord.f4   -1.243e+02 -6.891e+03  6.093e+03    10073 0.343299
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Cutpoints:
                         post.mean l-95% CI u-95% CI eff.samp
cutpoint.traitrating.o.1     3.309    3.101    3.517    172.5
cutpoint.traitrating.o.2     6.790    6.552    7.056    150.1
Obviously, something has gone kablooey here. The confidence intervals
for the grade, subject and obsord random effects range over 25 orders of
magnitude, and the fixed effects are also extremely large (but with
correspondingly large standard errors). The intercept is 143, while the
outcomes only range between 1 and 4. Can anyone tell me what I have
screwed up here?
TIA.
--
Stuart Luppescu -=- slu .at. ccsr.uchicago.edu
University of Chicago -=- CCSR
Father of 才文 and 智奈美 -=- Kernel 3.2.1-gentoo-r2
To paraphrase provocatively, 'machine learning is
statistics minus any checking of models and
assumptions'. -- Brian D. Ripley (about the
difference between machine learning and
statistics) useR! 2004, Vienna (May 2004)
_______________________________________________
R-sig-mixed-models at r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-sig-mixed-models