[R-sig-ME] MCMCglmm variance estimates Poisson distribution

Jarrod Hadfield j.hadfield at ed.ac.uk
Wed Mar 5 09:17:56 CET 2014


Hi John,

The two models are different. The obs variable in the glmer model does  
not have a unique level for each observation:

Number of obs: 1607, groups: obs, 168
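
(A minimal sketch of making the two fits comparable, assuming uc11 is the
shared data frame and with obs.unit as a hypothetical name: give each row its
own level, so the glmer random effect plays the same role as MCMCglmm's units
term.)

library(lme4)
uc11$obs.unit <- factor(seq_len(nrow(uc11)))   # one level per observation
glmer(totflct ~ (1 | obs.unit), family = poisson, data = uc11)

With that change the estimated obs.unit variance should end up much closer to
the units variance in the quoted MCMCglmm summary (about 0.71) than to the
0.08 reported for obs.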

Cheers,

Jarrod


Quoting "Hodsoll, John" <john.hodsoll at kcl.ac.uk> on Wed, 5 Mar 2014  
07:18:39 +0000:

> Hi Jarrod
>
> > Is there a reason that the data frames differ in each (uc11 and uc12)?
> Yes. Two different baseline conditions, a cut-and-paste error.
>
> With the same data frame
>
> summary(mcmc.c11.cf2)
>  Iterations = 10001:99911
>  Thinning interval  = 90
>  Sample size  = 1000
>
>  DIC: 7489.396
>
>  R-structure:  ~units
>
>       post.mean l-95% CI u-95% CI eff.samp
> units    0.7111   0.6276   0.7912     1000
>
>  Location effects: totflct ~ 1
>
>             post.mean l-95% CI u-95% CI eff.samp  pMCMC
> (Intercept)     1.211    1.160    1.263     1000 <0.001 ***
> ---
> Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
>
>> posterior.mode(exp(mcmc.c11.cf2$Sol+mcmc.c11.cf2$VCV/2))
> (Intercept)
>      4.7921
>
>> exp(mean(mcmc.c11.cf2$Sol)+mean(mcmc.c11.cf2$VCV/2))
> [1] 4.793908
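>
> (For reference, a quick sanity check, sketched on the assumption that
> totflct is the count column of uc11: the raw mean that the back-transformed
> values above are being compared against.)
>
> mean(uc11$totflct)   # observed mean count (about 4.69 per the earlier message)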
>
> And for glmer...
> Generalized linear mixed model fit by the Laplace approximation
> Formula: totflct ~ (1 | obs)
>    Data: uc11
>   AIC  BIC logLik deviance
>  6205 6216  -3100     6201
> Random effects:
>  Groups Name        Variance Std.Dev.
>  obs    (Intercept) 0.083672 0.28926
> Number of obs: 1607, groups: obs, 168
>
> Fixed effects:
>             Estimate Std. Error z value Pr(>|z|)
> (Intercept)  1.50622    0.02535   59.43   <2e-16 ***
> ---
> Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
>
> for which exp(1.50622 + 0.083672/2) = 4.70232
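>
> (The same back-transformation can be pulled straight from the fitted object;
> a sketch, with re.uc11.cf as a hypothetical name for the glmer fit shown
> above:)
>
> exp(fixef(re.uc11.cf)["(Intercept)"] + as.numeric(VarCorr(re.uc11.cf)$obs)/2)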
>
> I take your point re the prior. The number of observations is 1607, so I  
> thought this should be sufficient to limit the influence of the prior?
>
> Cheers
> John
> ______________________________________
> From: Jarrod Hadfield <j.hadfield at ed.ac.uk>
> Sent: 04 March 2014 17:13
> To: Hodsoll, John
> Cc: 'r-sig-mixed-models at r-project.org'
> Subject: RE: [R-sig-ME] MCMCglmm variance estimates Poisson distribution
>
> Hi John,
>
> Perhaps the output from:
>
> summary(mcmc.c11.cf2)
>
> and
>
> summary(re.uc12.cf)
>
> will shed some light?  Also is there a reason that the data frames
> differ in each (uc11 and uc12)?
>
> Failing something 'obvious', it must be the prior. How many
> observations is this based on?
>
> Cheers,
>
> Jarrod
>
>
> Quoting "Hodsoll, John" <john.hodsoll at kcl.ac.uk> on Tue, 4 Mar 2014
> 16:40:11 +0000:
>
>> Dear Jarrod
>>
>> Thanks for your reply. I was using 4, but all give a similar answer
>>
>> 1. 4.796014
>>
>> 2. 4.792395
>>
>> 3. 4.798754
>>
>> 4  4.790677
>>
>> As I said, I'm not sure whether I'm missing something obvious here.
>>
>> Cheers
>> John
>>
>>
>> -----Original Message-----
>> From: Jarrod Hadfield [mailto:j.hadfield at ed.ac.uk]
>> Sent: 04 March 2014 12:07
>> To: Hodsoll, John
>> Cc: 'r-sig-mixed-models at r-project.org'
>> Subject: Re: [R-sig-ME] MCMCglmm variance estimates Poisson distribution
>>
>> Dear John,
>>
>> How are you calculating the posterior expectation:
>>
>> 1/
>> posterior.mode(exp(mcmc.c11.cf2$Sol+mcmc.c11.cf2$VCV/2))
>> 2/
>> mean(exp(mcmc.c11.cf2$Sol+mcmc.c11.cf2$VCV/2))
>> 3/
>> exp(posterior.mode(mcmc.c11.cf2$Sol)+posterior.mode(mcmc.c11.cf2$VCV/2))
>> 4/
>> exp(mean(mcmc.c11.cf2$Sol)+mean(mcmc.c11.cf2$VCV/2))
>>
>> If it is not method 1/, try that and see if there is less of a  
>> discrepancy.
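>>
>> (The four summaries can differ because neither the posterior mean nor the
>> posterior mode commutes with exp(); a purely illustrative sketch with
>> simulated draws:)
>>
>> x <- rnorm(1000, 0, 1)       # stand-in for posterior draws on the latent scale
>> exp(mean(x)); mean(exp(x))   # Jensen's inequality: the second exceeds the first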
>>
>> Cheers,
>>
>> Jarrod
>>
>> Quoting "Hodsoll, John" <john.hodsoll at kcl.ac.uk> on Tue, 4 Mar 2014
>> 11:25:17 +0000:
>>
>>> Dear list
>>>
>>> I'm trying to get some unadjusted estimates and 95% CIs for a set of
>>> correlated count data (due to repeated measures on the same cluster).
>>> To do this I was trying to run an over-dispersed Poisson model using
>>> glmer and MCMCglmm.
>>>
>>> I want to use MCMCglmm as that's the package I wish to use for my main
>>> analysis. However, it seems to over-estimate the variance, meaning that
>>> the mean I get from the intercept-only model, exp(XB + V/2) (Ch. 2 of
>>> Jarrod Hadfield's course notes), is slightly greater than the actual
>>> mean. For example, if I fit the model
>>>
>>> priortr <- list(R=list(V=1, nu=0.001))
>>>
>>> mcmc.c11.cf2 <- MCMCglmm(totflct ~ 1, family="poisson", prior =
>>> priortr, data=uc11,
>>>                          nitt = 100000, burnin = 10000, thin = 90)
>>> summary(mcmc.c11.cf2)
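>>>
>>> (A minimal mixing check on the residual-variance chain, sketched with the
>>> coda functions that work on MCMCglmm's stored samples:)
>>>
>>> plot(mcmc.c11.cf2$VCV)           # trace and density of the units variance
>>> autocorr.diag(mcmc.c11.cf2$VCV)  # autocorrelation at the stored thinning lags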
>>>
>>> I'm ignoring the random effect and assuming the additive
>>> over-dispersion term will capture all the extra variance. For a count
>>> rate of 4.69 in the data I get 4.79, and for a count of 5.2 I get 5.52.
>>> On the other hand, if I use glmer with a per-observation random
>>> effect I get the correct means:
>>>
>>> re.uc12.cf <- glmer(totflct ~ (1|obs), family=poisson, data=uc12)
>>> summary(re.uc12.cf)
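>>>
>>> (Worth checking, sketched here: whether obs really has one level per row of
>>> uc12, since MCMCglmm's additive over-dispersion corresponds to an
>>> observation-level effect.)
>>>
>>> nlevels(factor(uc12$obs))   # distinct obs levels
>>> nrow(uc12)                  # should match the line above for a per-observation effect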
>>>
>>> Is there something I'm missing here?
>>>
>>> Regards
>>> John Hodsoll
>>>
>>
>



-- 
The University of Edinburgh is a charitable body, registered in
Scotland, with registration number SC005336.


