[R-sig-ME] Specifying a model with a link function in MCMCglmm

Jarrod Hadfield j.hadfield at ed.ac.uk
Fri Feb 19 09:09:03 CET 2016


Hi Sam,

Sorry, I should have said nu=k+1 (not nu=k+2) probably makes the 
marginal prior for the correlation close to uniform over the (-1, 1) 
interval. This was a guess based on the properties of the inverse-Wishart 
(see p. 72 of the CourseNotes), but it seems to hold. Below is a 
function for simulating from a parameter-expanded (PX) prior:

V2<-rpx(10000, V=diag(2), nu=2, alpha.mu=rep(0,2), alpha.V=diag(2)*1000)    # nu = k
V3<-rpx(10000, V=diag(2), nu=2+1, alpha.mu=rep(0,2), alpha.V=diag(2)*1000)  # nu = k + 1

par(mfrow=c(1,2))
hist(V2[,2]/sqrt(V2[,1]*V2[,4]))  # implied prior correlation, nu = k
hist(V3[,2]/sqrt(V3[,1]*V3[,4]))  # implied prior correlation, nu = k + 1
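
As a rough numerical follow-up to the histograms (only a sketch, using the 
V2/V3 draws above), the simulated correlation quantiles can be compared 
against those of a uniform distribution on (-1, 1):

r2 <- V2[,2]/sqrt(V2[,1]*V2[,4])   # implied correlations under nu = k
r3 <- V3[,2]/sqrt(V3[,1]*V3[,4])   # implied correlations under nu = k + 1
p  <- c(0.1, 0.25, 0.5, 0.75, 0.9)
# the nu.k1 and uniform rows should be similar if nu = k + 1 really gives a
# near-uniform marginal prior for the correlation
round(rbind(nu.k    = quantile(r2, p),
            nu.k1   = quantile(r3, p),
            uniform = qunif(p, -1, 1)), 3)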

Note that in multiparameter models, having flat marginal priors does NOT 
mean that the prior will have no influence on the marginal posterior.

Some simulation/analytic work on the multivariate PX prior would be a 
nice thing for someone to do!

Cheers,

Jarrod


rpx <- function(n=1, V, nu, alpha.mu, alpha.V){

  k <- nrow(V)

  # draw the working-parameter scales and the inverse-Wishart part of the prior
  alpha  <- matrix(MASS::mvrnorm(n, alpha.mu, alpha.V), n, k)
  Valpha <- MCMCglmm::rIW(V, nu=nu, n=n)

  # each draw of the expanded (co)variance matrix is diag(alpha) %*% Valpha %*% diag(alpha)
  Vp <- sapply(1:n,
    FUN=function(i){diag(alpha[i,], k) %*% matrix(Valpha[i,], k, k) %*% diag(alpha[i,], k)})

  # for k > 1: one row per draw, columns are the vectorised k x k matrix
  if(k==1){
    return(Vp)
  }else{
    return(t(Vp))
  }
}
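
A quick follow-up check (a rough sketch, not a rigorous test): the message 
below states that under this PX prior the marginal prior for each variance 
is a scaled F(1,1). If that is right, the prior quantiles of a variance 
should be roughly proportional to qf(p, 1, 1), whatever the scale:

p <- c(0.25, 0.5, 0.75, 0.9)
# using the V2 draws simulated above (nu = k = 2); the ratios should be
# roughly constant (up to Monte Carlo error) if the variance prior really
# is a scaled F(1,1)
round(quantile(V2[,1], p) / qf(p, df1 = 1, df2 = 1), 1)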






On 19/02/2016 07:27, Jarrod Hadfield wrote:
> Hi Sam,
>
> Priors for covariance matrices are tricky, but a plain inverse-Wishart 
> with nu=k (where k is the dimension of the matrix) is not a good idea 
> (see the second example below). I tend to use parameter-expanded priors 
> of the form
>
> list(V=diag(k), nu=k, alpha.mu=rep(0,k), alpha.V=diag(k)*1000)
>
> The marginal priors for the variances are scaled F(1,1), but I'm not 
> sure what they are for the covariances/correlations. For correlations I 
> think the prior would be flattish but with peaks close to -1 and 1. Upping 
> nu to k+2 probably makes the marginal prior for the correlation close to 
> uniform over the (-1, 1) interval. Unfortunately there seems to be little 
> or no theoretical work on this prior in a multivariate context: I use 
> it because it *seems* to have nice properties. For example, if I fitted 
> a set of random effects which in reality are all zero, then it gives 
> posterior support for the correlation across the whole (-1, 1) range.
>
> I would also add that I don't attempt to fit k>2 covariance matrices 
> unless my data are up to the task.
>
> Cheers,
>
> Jarrod
>
> library(MCMCglmm)
>
> k<-2
> Ve<-matrix(rIW(diag(k), 10), k, k)   # a random k x k residual covariance matrix
> y<-MASS::mvrnorm(100, rep(0,k), Ve)  # 100 bivariate-normal observations
> rfac<-gl(25,4)                       # grouping factor: 25 levels, 4 obs each
>
> #  the data are simply y~MVN(0, Ve)
>
> my_data<-data.frame(y1=y[,1], y2=y[,2], rfac=rfac)
>
> # parameter-expanded (PX) prior on the us(trait):rfac covariance matrix
> prior1<-list(R=list(V=diag(k), nu=0), G=list(G1=list(V=diag(k), nu=k, 
> alpha.mu=rep(0,k), alpha.V=diag(k)*1000)))
>
> m1<-MCMCglmm(cbind(y1,y2)~trait-1, random=~us(trait):rfac, 
> rcov=~us(trait):units, data=my_data, 
> prior=prior1,family=rep("gaussian", k))
>
> hist(m1$VCV[,2]/sqrt(m1$VCV[,1]*m1$VCV[,4]), breaks=30)
> # posterior correlation of the random effects: nice
>
> # plain inverse-Wishart prior with nu=k (no parameter expansion)
> prior2<-list(R=list(V=diag(k), nu=0), G=list(G1=list(V=diag(k), nu=k)))
>
> m2<-MCMCglmm(cbind(y1,y2)~trait-1, random=~us(trait):rfac, 
> rcov=~us(trait):units, data=my_data, 
> prior=prior2,family=rep("gaussian", k))
>
> hist(m2$VCV[,2]/sqrt(m2$VCV[,1]*m2$VCV[,4]), breaks=30)
> # bad - look at the variances too!
>
>
>
>
>
>
>
> On 18/02/2016 17:57, Sam H wrote:
>> Thanks for your response Jarrod! Regarding the prior specification, I 
>> forgot to mention the reason I did that. Originally I specified the 
>> model with no prior, but it kept failing just before the 1000th 
>> iteration with the error message "ill-conditioned G/R structure". I 
>> was not very sure what to do since I don't really have any estimates 
>> to use for a prior, so I figured I would just use estimates produced 
>> by lmer. Should I just set the G structure to an identity matrix, or 
>> does it not really matter since nu is high?
>>
>> On Thu, Feb 18, 2016 at 12:48 AM, Jarrod Hadfield 
>> <j.hadfield at ed.ac.uk> wrote:
>>
>>     Hi Sam,
>>
>>     Each distribution has a fixed link function in MCMCglmm, and an
>>     inverse link function for a Gaussian response would be hard to
>>     implement.
>>
>>     Also, using a REML estimate as a prior, and then having quite a
>>     strong degree of belief parameter is a bit odd. Many, including
>>     myself, would consider this double-dipping.
>>
>>     Cheers,
>>
>>     Jarrod
>>
>>
>>
>>     On 17/02/2016 22:26, Sam H wrote:
>>
>>         Hello all,
>>
>>
>>         Before jumping into my question, let me first briefly explain
>>         my model to
>>         give context. Here is how I currently have specified the model
>>         using
>>         MCMCglmm, first specifying a model in lmer and extracting the
>>         variance
>>         estimates for my G prior:
>>
>>         full_lmer <- lmer(RT ~ AgeGroup*cue*cong + (cue + cong|PID),
>>                           data = vsdataset)
>>
>>         sigma2 <- sigma(full_lmer)^2
>>
>>         Lambda <- getME(full_lmer, "Lambda")
>>
>>         Sigma <- sigma2*tcrossprod(Lambda)
>>
>>         # extract the random-effect VCV block from the block-diagonal Sigma
>>         Gmer <- Sigma[1:4,1:4]
>>
>>         full_mcmc <- MCMCglmm(RT ~ AgeGroup*cue*cong,
>>                               random = ~us(1 + cue + cong):PID,
>>                               thin = 10, nitt = 20000, data = vsdataset,
>>                               prior = list(G = list(G1 = list(V = diag(diag(Gmer)), nu = 5))))
>>
>>         Note: I used V = diag(diag(Gmer)) because Sigma/Gmer was not
>>         positive definite, which MCMCglmm would not accept.
>>
>>
>>         Quick explanation of model terms:
>>
>>         RT --> response time in msec, very positively skewed even after
>>         outlier removal, inverse transform seems to center it
>>
>>         AgeGroup --> 2 level factorial var (Old and Young, between-subjects)
>>
>>         cue --> 3 level factorial var (within-subjects)
>>
>>         cong --> 2 level factorial var (within-subjects)
>>
>>         PID --> participant ID number
>>
>>
>>         I want to specify this model with an inverse/reciprocal link
>>         function (Gaussian family). However, I can't figure out how to
>>         specify the link function. In the help section for the MCMCglmm
>>         function, they mention a "linking.function" for the random
>>         effects terms, but it doesn't seem to have anything to do with
>>         specifying a link function for the response variable. According
>>         to the course notes for the MCMCglmm package, "there are many
>>         different types of distribution and link functions and those
>>         supported by MCMCglmm can be found in Table 7.1." However,
>>         Table 7.1 seems to just list the families and their PDFs;
>>         there's no column listing "supported" link functions.
>>
>>         So, how do you specify a link function using MCMCglmm? If you
>>         can't directly specify a link function, is there something else
>>         I need to do, such as specifying the prior a certain way, or is
>>         it valid to just specify the model as having a gaussian response
>>         and leave the model as I've specified it? After plotting the
>>         model, I noticed that several of the parameter distributions
>>         were extremely skewed (some left, some right).
>>
>>
>>         As a side note, I originally tried two alternatives:
>>
>>         1) using lmer with an inverse transform
>>
>>         2) using glmer with family = gaussian(link = "inverse") and
>>         family = inverse.gaussian(link = "identity")
>>
>>
>>         #1 seems problematic because I need to convert the response
>>         variable back to the original units, which not only flips any
>>         confidence intervals but also makes them asymmetric. I'm not
>>         sure whether converting these CIs is even appropriate, as they
>>         were computed on a different scale/distribution. I also don't
>>         know of any way to validly convert the standard error back,
>>         since that is certainly not valid once I back-transform.
>>
>>         #2 gave me some issues: first, I had to scale down RT by a
>>         factor of 1000 (from ms to s) when using gaussian(link =
>>         "inverse"), otherwise I would get an error about the downdated
>>         VtV not being positive definite. After dividing RT by 1000 the
>>         model ran, but it did not fully converge (I think the max
>>         absolute gradient was approximately .02). I decided to rerun
>>         the model after changing the contrasts on my variables from the
>>         default dummy coding to effect coding (using contr.sum). The
>>         same thing happened, except this time the max gradient was a
>>         little higher (about .0375), and in addition I got the "model
>>         is nearly unidentifiable" warning due to a large eigenvalue.
>>         When I ran the model with inverse.gaussian(link = "identity"),
>>         it worked without scaling down RT by 1000, but I got a bunch of
>>         optimizer warnings, so I scaled it down and this time it wasn't
>>         able to converge because the max absolute gradient value was
>>         about .0247.
>>
>>
>>         Any help on this would be greatly appreciated!
>>
>>
>>         - Sam
>>