[R-sig-ME] Subject-wise gradient and Hessian

Ben Bolker bbolker at gmail.com
Tue Sep 23 19:42:22 CEST 2014


  I agree that it's not going to be very easy, but I would suggest that
the code at
https://github.com/lme4/lme4/blob/master/misc/notes/laplDiag.Rmd would
be a good starting point: what this function does (for GLMMs) is to
compute the deviance residuals for a particular set of conditional mode
offsets and sum them by group.  You could probably adapt this approach
to compute the gradient and Hessian of the deviance residuals by
individual ...
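
A very rough sketch of one way to do that for a Gaussian LMM (this is *not*
the laplDiag.Rmd code): write the unprofiled ML deviance in terms of
(theta, beta, log sigma) and evaluate it subject by subject, which is
legitimate here because the marginal likelihood factors over the levels of a
single grouping factor.  The parameterization and the use of numDeriv are
just one possible choice; 'fm1' and Orthodont are as in the original
question below.

library(lme4)
library(Matrix)
library(numDeriv)

## per-subject -2 log marginal likelihood for a Gaussian LMM, written in
## terms of (theta, beta, log sigma); 'rows' are that subject's data rows.
## The subject-wise factorization only holds with a single grouping factor.
devSubject <- function(pars, fm, rows) {
    nth   <- length(getME(fm, "theta"))
    p     <- length(fixef(fm))
    theta <- pars[seq_len(nth)]
    beta  <- pars[nth + seq_len(p)]
    sigma <- exp(pars[nth + p + 1])          # log scale keeps sigma > 0
    Lt <- getME(fm, "Lambdat")
    Lt@x[] <- theta[getME(fm, "Lind")]       # map theta into Lambda'
    Z  <- t(getME(fm, "Zt"))[rows, , drop = FALSE]
    X  <- getME(fm, "X")[rows, , drop = FALSE]
    y  <- getME(fm, "y")[rows]
    Vi <- sigma^2 * (Diagonal(length(y)) + tcrossprod(Z %*% t(Lt)))
    r  <- y - X %*% beta
    as.numeric(length(y) * log(2 * pi) +
               determinant(Vi)$modulus + crossprod(r, solve(Vi, r)))
}

## evaluate at the fitted values of fm1 (Orthodont data, as in the post below)
pars <- c(getME(fm1, "theta"), fixef(fm1), log(sigma(fm1)))
rowsBySubj <- split(seq_len(nrow(Orthodont)), Orthodont$Subject)
gradBySubj <- lapply(rowsBySubj,
                     function(rows) grad(devSubject, pars, fm = fm1, rows = rows))
hessBySubj <- lapply(rowsBySubj,
                     function(rows) hessian(devSubject, pars, fm = fm1, rows = rows))

Summing devSubject() over all subjects gives the whole-data ML deviance (not
the REML criterion), so the per-subject gradients and Hessians also sum to
the overall ones.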

On 14-09-23 11:41 AM, Douglas Bates wrote:
> On Tue, Sep 23, 2014 at 10:30 AM, Vahid Nassiri <Vahid.Nassiri at kuleuven.be>
> wrote:
> 
>>  Thanks for the paper and the explanations,
>>
>>  But can I change the objective function myself?
>> That is, would something need to be changed in "lmer" itself, or in the
>> function I get when I update it with "devFunOnly=TRUE"?
>>
> 
> It is not trivial to do.  I would start with the lme4pureR package, which
> may exist only on GitHub:
> 
> install.packages("devtools")
> devtools::install_github("lme4/lme4pureR")
> 
> to see how the profiled deviance or profiled REML criterion is evaluated.
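> 
> Once it is installed, one quick way to start browsing its (pure R) code is
> just to list what it exports (I have not checked the function names, so
> take this only as a way in):
> 
> library(lme4pureR)
> ls("package:lme4pureR")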
> 
> 
>>
>>  Vahid.
>>  ------------------------------
>> *From:* dmbates at gmail.com [dmbates at gmail.com] on behalf of Douglas Bates [
>> bates at stat.wisc.edu]
>> *Sent:* Tuesday, September 23, 2014 5:11 PM
>> *To:* Vahid Nassiri
>> *Cc:* r-sig-mixed-models at r-project.org
>> *Subject:* Re: [R-sig-ME] Subject-wise gradient and Hessian
>>
>>   The function being optimized when fitting a linear mixed-effects model
>> is the profiled deviance or the profiled REML criterion.  Details are given
>> in http://arxiv.org/abs/1406.5823.  The term "profiled" means that,
>> conditional on values of parameters that determine the relative covariance
>> matrix of the random effects, the optimal values of the fixed-effects
>> coefficients and the residual variance parameter can be determined from
>> the solution of a penalized least-squares problem.  This reduces the
>> complexity of the optimization problem considerably and leads to faster and
>> more robust algorithms for fitting such models.
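>>
>> (For reference, and if I recall the notation of that paper correctly, the
>> profiled deviance is
>>
>>   -2 \tilde{\ell}(\theta) = \log |L_\theta|^2 + n [1 + \log(2 \pi r^2_\theta / n)],
>>
>> where L_\theta is the sparse Cholesky factor from the penalized
>> least-squares problem and r^2_\theta is the penalized residual sum of
>> squares; the REML criterion additionally includes \log |R_X|^2 and
>> replaces n by n - p, with p the number of fixed-effects coefficients.)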
>>
>>  However, it also means that the parameters being optimized are indirectly
>> related to what we view as the statistically meaningful parameters in the
>> model.  If you need the gradient and Hessian with respect to the
>> fixed-effects parameters or the variance components, you should write the
>> objective function in terms of those parameters.
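>>
>> A quick numerical sketch, assuming the numDeriv package is installed (the
>> function returned by devFunOnly = TRUE for an LMM is the profiled
>> criterion, so it takes only theta, here of length 3, which is why a
>> length-7 parameter vector is rejected):
>>
>> dd <- update(fm1, devFunOnly = TRUE)  # profiled criterion, theta only
>> th <- getME(fm1, "theta")
>> numDeriv::grad(dd, th)     # should roughly match fm1@optinfo$derivs$gradient
>> numDeriv::hessian(dd, th)  # 3 x 3, roughly fm1@optinfo$derivs$Hessian
>>
>> Derivatives with respect to the fixed effects and the variance parameters
>> require an unprofiled objective written directly in terms of
>> (theta, beta, sigma); summing the per-subject function sketched near the
>> top of this page over all subjects is one way to get such an objective.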
>>
>> On Tue, Sep 23, 2014 at 7:39 AM, Vahid Nassiri <Vahid.Nassiri at kuleuven.be>
>> wrote:
>>
>>> Dear all,
>>>
>>> I just use this simple code to obtain the gradient and Hessian of the
>>> model fitted to the Orthodont data (for example) for the different parameters:
>>>
>>>> fm1 <- lmer(scale(distance)~Sex*I(scale(age))+(I(scale(age))|Subject),
>>> data=Orthodont)
>>>> fm1 at optinfo$derivs$gradient
>>> [1] 1.578826e-07 1.200391e-06 4.504841e-08
>>>> fm1 at optinfo$derivs$Hessian
>>>            [,1]       [,2]       [,3]
>>> [1,]  30.766258  -4.868289 -13.459066
>>> [2,]  -4.868289 128.743919   1.404892
>>> [3,] -13.459066   1.404892  58.653149
>>>
>>> So, it seems I have seven parameters (4 fixed effects, 2 random effects,
>>> and one error variance), so why do I get a 3 \times 3 Hessian and a
>>> 3 \times 1 gradient?
>>>
>>> I also tried to use:
>>>
>>>> fm1_update <- update(fm1,devFunOnly=TRUE)
>>>> params <- unlist(getME(fm1,c("theta","beta")))
>>>> grad(fm1_update,params)
>>> (7!=3)
>>> Error: theta size mismatch
>>>>
>>>
>>> But you can see the error I got; it only works when I include just
>>> "theta" and not the fixed effects "beta".
>>>
>>> I wonder if anyone has an idea about how I can obtain the gradient and
>>> Hessian for all the parameters in the model.
>>>
>>> Afterward, I would try to find a way to calculate these quantities by
>>> subject. That would probably be possible by fitting a new model for each
>>> subject, starting from the fitted values and running just one iteration.
>>> But that is the next step, after I know how I can obtain the gradient and
>>> Hessian for each parameter in the first place.
>>>
>>> Thanks!
>>> Vahid.
>>>
>>
>>
> 
>


