[R-sig-ME] multilevel logistic model with MCMCglmm

Jarrod Hadfield j.hadfield at ed.ac.uk
Sun Apr 3 18:02:44 CEST 2011


Hi Ronggui,

Fixing the residual variance at 1 ADDS 1 to the level-1 error variance on the
latent scale rather than replacing pi^2/3.
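
A minimal numeric sketch of what this implies on the latent (liability) scale:

pi^2 / 3      # level-1 variance contributed by the logistic link: ~3.29
1 + pi^2 / 3  # total level-1 variance with the residual fixed at 1: ~4.29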

Cheers,

Jarrod


Quoting Wincent <ronggui.huang at gmail.com> on Sun, 3 Apr 2011 23:59:26 +0800:

> Hi Professor Hadfield, I see.
>
> In the multilevel logistic model, the level-1 error variance is fixed at
> pi^2/3, which I should perhaps follow.
>
> Thank you very much.
>
> BTW, MCMCglmm is a great package.
>
> Regards,
> Ronggui
>
> On 3 April 2011 23:53, Jarrod Hadfield <j.hadfield at ed.ac.uk> wrote:
>
>> Hi Wincent,
>>
>> The model you ran is meaningless because, with categorical or ordinal data,
>> the residual variance cannot be estimated from the data. The best option is
>> to fix the residual variance at some value (I use 1):
>>
>> MC1 <- MCMCglmm(use ~ urban + age + livch, random = ~district,
>>                 data = Contraception, family = "categorical",
>>                 prior = list(R = list(V = 1, fix = 1),
>>                              G = list(G1 = list(V = 1, nu = 0))),
>>                 slice = TRUE)
>>
>> I have left the prior on the district variance flat and improper, and I have
>> used slice sampling (slice = TRUE) instead of Metropolis-Hastings updates
>> because the chain mixes faster:
>>
>> summary(MC1)
>>
>>  Iterations = 3001:12991
>>  Thinning interval  = 10
>>  Sample size  = 1000
>>
>>  DIC: 2391.208
>>
>>
>>  G-structure:  ~district
>>
>>         post.mean l-95% CI u-95% CI eff.samp
>> district    0.3333   0.1407   0.5777    459.9
>>
>>
>>  R-structure:  ~units
>>
>>      post.mean l-95% CI u-95% CI eff.samp
>> units         1        1        1        0
>>
>>
>>  Location effects: use ~ urban + age + livch
>>
>>            post.mean l-95% CI u-95% CI eff.samp  pMCMC
>> (Intercept)  -2.02718 -2.38427 -1.69340    559.7 <0.001 ***
>> urbanY        0.87874  0.61785  1.16036    640.9 <0.001 ***
>> age          -0.03233 -0.05013 -0.01523    531.4 <0.001 ***
>> livch1        1.32248  0.96341  1.66027    578.6 <0.001 ***
>> livch2        1.63986  1.24274  2.05330    542.6 <0.001 ***
>> livch3+       1.61311  1.18382  2.00119    512.2 <0.001 ***
>>
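>> As a rough check of the mixing (the eff.samp column above), the chains can
>> also be inspected directly with coda; a minimal sketch:
>>
>> library(coda)                # MCMCglmm chains are coda mcmc objects
>> effectiveSize(MC1$Sol)       # effective sample sizes for the fixed effects
>> autocorr.diag(MC1$Sol)       # lag autocorrelations of the fixed-effect chains
>> plot(MC1$VCV[, "district"])  # trace and density of the district variance
>>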
>> Note that you can rescale the estimates to approximate what they would be if
>> the residual variance were zero (i.e. the glmer default):
>>
>> # c2 ~= 0.346 corrects for the fixed residual variance of 1
>> c2 <- ((16 * sqrt(3)) / (15 * pi))^2
>>
>> posterior.mode(MC1$Sol / sqrt(1 + c2))
>> (Intercept)      urbanY         age      livch1      livch2     livch3+
>> -1.69345255  0.72259159 -0.03015856  1.14554895  1.38585993  1.34449154
>>
>> which are closer to the ML estimates.
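>>
>> A small follow-up sketch under the same approximation: variance components
>> rescale by 1/(1 + c2) rather than 1/sqrt(1 + c2), and the glmer fit fm1 from
>> the message quoted below gives the ML estimates to compare against:
>>
>> posterior.mode(MC1$VCV[, "district"] / (1 + c2))  # rescaled district variance
>> fixef(fm1)                                        # ML fixed effects for comparison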
>>
>> Cheers,
>>
>> Jarrod
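>>
>> On the question in the original message of whether a prior must be
>> specified: the flat, improper prior used above is one option; a proper
>> parameter-expanded prior on the district variance is a common alternative.
>> A minimal sketch (prior2 and MC2 are illustrative names):
>>
>> prior2 <- list(R = list(V = 1, fix = 1),
>>                G = list(G1 = list(V = 1, nu = 1, alpha.mu = 0, alpha.V = 1000)))
>> MC2 <- MCMCglmm(use ~ urban + age + livch, random = ~district,
>>                 data = Contraception, family = "categorical",
>>                 prior = prior2, slice = TRUE)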
>>
>>
>>
>> Quoting Wincent <ronggui.huang at gmail.com> on Sun, 3 Apr 2011 20:32:27 +0800:
>>
>>> Dear list members,
>>>
>>> I fit a random-intercept model with lme4 first, then tried to fit the same
>>> model with MCMCglmm. I didn't specify a prior and ran 300000 iterations.
>>> The model still didn't converge, as the autocorrelation is very high
>>> (>0.4). May I ask: (1) do I have to specify my own prior? and (2) what is
>>> the usual/average number of iterations in a Bayesian multilevel model?
>>> 300000 seems like quite a large number. Do I have to rerun the model if
>>> the previous one doesn't converge?
>>>
>>> Best with thanks.
>>>
>>> data(Contraception, package = "mlmRev")
>>> library(lme4)
>>> fm1 <- glmer(use ~ urban + age + livch + (1 | district),
>>>              data = Contraception, family = binomial, nAGQ = 7)
>>> MC1 <- MCMCglmm(use ~ urban + age + livch, random = ~district,
>>>                 data = Contraception, family = "categorical",
>>>                 nitt = 300000, burnin = 240000, thin = 100)
>>> summary(MC1)
>>>
>>>  Iterations = 240001:299901
>>>  Thinning interval  = 100
>>>  Sample size  = 600
>>>
>>>  DIC: 47.28245
>>>
>>>  G-structure:  ~district
>>>
>>>         post.mean l-95% CI u-95% CI eff.samp
>>> district      3242      953     6490      132
>>>
>>>  R-structure:  ~units
>>>
>>>      post.mean l-95% CI u-95% CI eff.samp
>>> units     38025    13640    57500    85.53
>>>
>>>  Location effects: use ~ urban + age + livch
>>>
>>>            post.mean l-95% CI u-95% CI eff.samp  pMCMC
>>> (Intercept)  -197.901 -262.707 -130.966    92.68 <0.002 **
>>> urbanY         86.448   53.374  119.831   131.85 <0.002 **
>>> age            -3.139   -5.439   -1.200   166.36 <0.002 **
>>> livch1        128.027   76.737  177.583   126.78 <0.002 **
>>> livch2        159.752  102.289  222.468   109.28 <0.002 **
>>> livch3+       155.254   92.145  216.916   123.80 <0.002 **
>>> ---
>>> Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
>>>
>>> autocorr(MC1$Sol)
>>> <snip>
>>>
>>> --
>>> Wincent Ronggui HUANG
>>> Sociology Department of Fudan University
>>> PhD of City University of Hong Kong
>>> http://asrr.r-forge.r-project.org/rghuang.html
>>>
>>>
>>>
>>>
>>
>>
>
>
> --
> Wincent Ronggui HUANG
> Sociology Department of Fudan University
> PhD of City University of Hong Kong
> http://asrr.r-forge.r-project.org/rghuang.html
>



-- 
The University of Edinburgh is a charitable body, registered in
Scotland, with registration number SC005336.