[R-sig-ME] Question regarding the glmer function from the LME4 package

Douglas Bates dmbates at gmail.com
Mon Sep 4 18:31:37 CEST 2023


In your simulation it looks like you have a separate random effect (element
of z) for each observation (element of y).  Is that intentional?

It seems that this model, while perhaps theoretically identifiable, is
going to be poorly estimated as it will be difficult to separate the
influence of the random effects from the per-observation noise.

Generally we think of each element of the random effects as being associated
with several observations - that is, the grouping factor `id` has fewer
levels than there are observations.  In such cases it is difficult to make sense of
contributions from individual observations to the log-likelihood.  If you
have only one grouping factor for the random effects, you can evaluate the
log-likelihood as a sum of contributions from each element of the random
effects - which is why Gauss-Hermite integration can be applied to scalar
integrals instead of Laplace's approximation.  But still the log-likelihood
requires integration with respect to the random effects to obtain the
marginal distribution of the observations which is then evaluated at the
observed responses.
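
To make that concrete, here is a base-R sketch (my illustration, not lme4's actual implementation - lme4 uses an adaptive rule centered at the conditional mode) of how a scalar Gauss-Hermite rule evaluates one such integral, the marginal likelihood contribution of a single binomial observation under a standard normal random effect.  The nodes and weights come from the Golub-Welsch eigenvalue construction.

```r
## Gauss-Hermite nodes/weights for the weight function exp(-t^2),
## via the eigendecomposition of the Jacobi matrix (Golub-Welsch)
gauss_hermite <- function(k) {
  i <- seq_len(k - 1)
  J <- matrix(0, k, k)
  J[cbind(i, i + 1)] <- J[cbind(i + 1, i)] <- sqrt(i / 2)
  e <- eigen(J, symmetric = TRUE)
  list(nodes   = e$values,                     # quadrature nodes t_j
       weights = sqrt(pi) * e$vectors[1, ]^2)  # weights w_j
}

## integral of dbinom(x, n, plogis(beta + theta * u)) * dnorm(u) du,
## using the substitution u = sqrt(2) * t to match the exp(-t^2) weight
gh_contrib <- function(x, n, beta, theta, k = 25) {
  q <- gauss_hermite(k)
  u <- sqrt(2) * q$nodes
  sum(q$weights / sqrt(pi) * dbinom(x, n, plogis(beta + theta * u)))
}
```

For a single observation this agrees closely with `integrate()` applied to the same integrand, e.g. `integrate(function(u) dbinom(3, 50, plogis(-2 + u)) * dnorm(u), -Inf, Inf)`.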

I think the simplifications available in the generalized linear model (i.e.
a model without random effects) are a source of confusion here.  In that
case the deviance (negative twice the log-likelihood) can be expressed as
the sum of the "unit deviances" for each observation because the individual
responses are independent in the probability model.  That is not the case
for GLMMs.  We still use the unit deviances in the evaluation of
(approximations to) the deviance of the GLMM but the sum of these unit
deviances is only part of the deviance of the model itself.
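
As a concrete illustration of that last point (my example, not from the original message): for an ordinary binomial `glm()` the squared deviance residuals are the unit deviances, and they sum exactly to the model deviance; no analogous per-observation decomposition is available for a `glmer()` fit.

```r
## In a GLM the deviance decomposes into per-observation unit deviances
set.seed(1)
n <- 100
x <- rnorm(n)
y <- rbinom(n, 10, plogis(0.5 * x))              # simulated binomial data
fit <- glm(cbind(y, 10 - y) ~ x, family = binomial)

unit_dev <- residuals(fit, type = "deviance")^2  # the unit deviances
all.equal(sum(unit_dev), deviance(fit))          # TRUE
```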

I have written about how I understand the GLMM formulation in an appendix
of the in-progress book
https://juliamixedmodels.github.io/EmbraceUncertainty/  We use Julia and
the MixedModels.jl package in that book but the approach is similar to that
in lme4.

On Mon, Sep 4, 2023 at 9:32 AM Alexandra Lefebvre <
alexandra.lefebvre at math.cnrs.fr> wrote:

> Dear colleagues,
>
> I am using the LME4 package, in particular the glmer function, in the
> framework of an EM algorithm. I would need to retrieve individual
> log-likelihoods (for each observation) and I am wondering if that output
> can be obtained from the package.
>
> As an example, here is a short piece of code:
>
> require(lme4)
> set.seed(1)
>
> theta.star = 2.5
> beta.star = -2
>
> z = rnorm(2000);
> x = rbinom(2000, 50, exp(beta.star + z * theta.star) / (1 + exp(beta.star
> + z * theta.star)))
> dat = data.frame(x = x, id = 1: length(x))
> fit = glmer(cbind(x, 50-x) ~ 1 + (1 | id), data = dat, family = "binomial")
>
> theta = fit@theta               # 2.499677
> beta = fit@beta                 # -1.988883
>
> #individual loglik
> ind.loglik = rep(NA, length(x))
> for (i in 1:length(x)) {
>   f = function(u) dbinom(x[i], 50, exp(beta + theta * u) / (1 + exp(beta +
> theta * u))) * dnorm(u)
>   ind.loglik[i] = log(integrate(f, -5, 5)$val)
> }
>
> As far as I understood, you use the Laplace or Gauss-Hermite method. I guess
> each individual log-likelihood is calculated when fitting the model.  Do you
> know of any way to retrieve the vector of individual log-likelihoods
> directly as an output from your package?
>
> Alternatively, I read in your notes that the Laplace approximation
> correction terms for converting a conditional log-likelihood into a
> marginal log-likelihood are `gm1@pp$sqrL(1)` and `gm1@pp$ldL2()`. Are these
> available at an individual level?
>
> Furthermore, I am surprised that the results of the following two lines of
> code are not almost equal, and therefore I wonder if my understanding of
> the output in the above code is correct.
> sum(ind.loglik)         # -6774.249
> logLik(fit)                     # 'log Lik.' -6807.348 (df=2)
>
>
> Kind regards,
> Alexandra Lefebvre
>
> —————————————
> Post-doctoral researcher at CIRB (Collège de France) and LJLL (Sorbonne
> Université, Paris, France)
> CIRB - Collège de France, 11, place Marcelin-Berthelot, 75231 Paris Cedex
> 05.
> Building B, level 1 - Room 104.
>
>
> _______________________________________________
> R-sig-mixed-models at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-sig-mixed-models
>

