[R-sig-ME] Residual Sum of Squares Issues with Linear Mixed Models

Douglas Bates bates at stat.wisc.edu
Fri Nov 20 15:50:29 CET 2009


On Fri, Nov 20, 2009 at 4:23 AM,  <Howsun.Jow at newcastle.ac.uk> wrote:
> I'm having problems understanding why the residual sum of squares is sometimes
> smaller for a "reduced" linear mixed model than for the "full" model. Take the
> "Pastes" dataset for example:
>
> fm3M <- lmer(strength ~ 1 + (1|batch) + (1|sample), Pastes, REML=FALSE)
> fm4M <- lmer(strength ~ 1 + (1|sample), Pastes, REML=FALSE)
>
>> sum(resid(fm3M)^2)
> [1] 21.04984
>> sum(resid(fm4M)^2)
> [1] 21.03147
>
> The reduced model seems to fit the data better than the full model. Is there
> something fundamental I'm missing about linear mixed effects models?

The maximum likelihood estimates of the parameters in a linear
mixed-effects model are not the least squares estimates.  The
conditional means of the random effects and the estimates of the
fixed-effects parameters are penalized least squares estimates.  When
you remove one random-effects term, the estimated variance of the
random effects for the remaining term may change, resulting in a
different penalty and possibly a larger residual sum of squares.
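A minimal sketch (assuming the lme4 package and its bundled Pastes data)
that compares the two fits on the criterion lmer actually optimizes, the
log-likelihood, rather than on the residual sum of squares:

library(lme4)

fm3M <- lmer(strength ~ 1 + (1 | batch) + (1 | sample), Pastes, REML = FALSE)
fm4M <- lmer(strength ~ 1 + (1 | sample), Pastes, REML = FALSE)

## The ML deviance (-2 * log-likelihood) is what is minimized; on that
## scale the full model can never fit worse than a model nested within it.
anova(fm4M, fm3M)                         # likelihood-ratio comparison
c(logLik(fm3M), logLik(fm4M))

## The residual sums of squares are not the fitting criterion, so their
## ordering carries no such guarantee.
c(sum(resid(fm3M)^2), sum(resid(fm4M)^2))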
