[R] MCMCglmm and iteration
Ben Bolker
bbolker at gmail.com
Wed Feb 17 03:32:56 CET 2016
Rémi Lesmerises <remilesmerises <at> yahoo.ca> writes:
> Hi everyone, I'm running a Bayesian regression using the package
> MCMCglmm (Hadfield 2010), and to reach a normal posterior
> distribution of estimates I increased the number of iterations as
> well as the burn-in threshold. However, this had unexpected
> outcomes: although it improved the posterior distribution, it also
> dramatically increased the value of the estimates and decreased the DIC.
> Here's an example:
head(spring)
 pres large_road small_road  cab
    0       2011         32   78
    1        102        179  204
    0       1256        654  984
    1        187        986  756
    0         21        438   57
    1         13          5  439
# pres is presence/absence data; the other variables are distances to
# these features
# with 200,000 iterations and a 30,000-iteration burn-in
prior <- list(R = list(V = 1, nu = 0.002))
sp.simple <- MCMCglmm(pres ~ large_road + cab + small_road,
                      family = "categorical", nitt = 200000, thin = 200,
                      burnin = 30000, data = spring, prior = prior,
                      verbose = FALSE, pr = TRUE)
------------
(1) you will do much better with this kind of question on r-sig-mixed-models.
(2) it looks like your chain is mixing very, very badly. If I'm reading
the output correctly, it looks like your effective sample sizes for the
first run (200K iterations) are 1-3 (!) -- you should be aiming for
effective sample sizes of 100s to 1000s. Even with a million iterations
you're only getting up to effective sample sizes of ~150 for some
parameters. I would recommend (a) centring and scaling your predictors
to improve mixing and (b) cross-checking the estimates with a different
method (e.g. lme4 or glmmADMB) to make sure you're in the right ballpark.
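A minimal sketch of both suggestions. The `spring` data frame below is simulated (the real data aren't available here), and since this particular model has no random effects, a plain `glm(family = binomial)` serves as the frequentist cross-check; `coda::effectiveSize()` is the usual way to inspect effective sample sizes from an MCMCglmm fit:

```r
## A sketch, not the original analysis: 'spring' is simulated here
## because the real data are not available.
set.seed(1)
spring <- data.frame(
  pres       = rbinom(100, 1, 0.5),
  large_road = runif(100, 0, 1500),
  small_road = runif(100, 0, 1000),
  cab        = runif(100, 0, 1000)
)

## (a) centre and scale the distance covariates to improve mixing
covars <- c("large_road", "small_road", "cab")
spring[covars] <- scale(spring[covars])

## (b) frequentist cross-check: no random effects in this model, so a
## plain logistic regression is enough (lme4::glmer() would be the
## analogue if you add random effects)
check <- glm(pres ~ large_road + cab + small_road,
             family = binomial, data = spring)
summary(check)$coefficients

## For the MCMCglmm fit itself, inspect effective sample sizes, e.g.:
##   coda::effectiveSize(sp.simple$Sol)   # aim for 100s-1000s
```

If the glm estimates (after accounting for the scaling) are wildly different from the MCMCglmm posterior means, that's a strong sign the chain hasn't converged.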
You shouldn't necessarily expect a Normal posterior as you increase
the number of iterations; the posterior distributions are only
asymptotically Normal as the number of *observations* increases.