# [R] Mean Squares

Douglas Bates bates at stat.wisc.edu
Fri Feb 28 19:23:43 CET 2003

```
"Pedro J. Aphalo" <pedro.aphalo at cc.jyu.fi> writes:

> Douglas Bates wrote:
> >
> > bhx2 at mevik.net (Bjørn-Helge Mevik) writes:
> >
> > > Mona Riihimaki <mona at sun3.oulu.fi> writes:
> > >
> > > > I've done lme-analysis with R; [...] I'd need also the mean squares.
> > >
> > > AFAIK, lme doesn't calculate sum of squares (or mean squares).  It
> > > maximises the likelihood (or restricted likelihood) and uses tests
> > > based on likelihood ratios.
> >
> > Yes - you are correct.
> >
> although the function is called anova.lme, is it still correct to talk
> about "anova results" when referring to the results of these tests? and
> in the case of the Wald tests in the single lme object case?

anova applied to lme objects generates different types of tests
depending on whether it is called with one argument or with more than
one.  (We took Oscar Wilde's admonition that "Consistency is the
last refuge of the unimaginative" to heart.)

With more than one argument, likelihood ratio statistics and their
p-values are returned.  These are appropriate for comparing models in
which the random-effects structure has changed.  Bear in mind that the
p-values can be conservative because the null hypothesis is usually on
the boundary of a constrained parameter space.

With a single argument, F-tests on terms in the fixed-effects part of
the model are returned.  These tests are conditional on the values of
the parameters determining the random-effects distribution.  This is
usually not a problem because these parameters are asymptotically
uncorrelated with the fixed-effects parameters.

I would refer to the results for more than one argument as
"conservative likelihood ratio tests" or just "likelihood ratio
tests".

```
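As a sketch of the two calling conventions described above (assuming the nlme package; the models on the Orthodont data shipped with nlme are illustrative, not from the original thread):

```r
library(nlme)

## Two fits that differ only in the random-effects structure.
fm1 <- lme(distance ~ age, data = Orthodont, random = ~ 1 | Subject)
fm2 <- lme(distance ~ age, data = Orthodont, random = ~ age | Subject)

## More than one argument: a likelihood ratio test comparing the
## random-effects structures.  The p-value may be conservative because
## the null hypothesis lies on the boundary of the parameter space.
anova(fm1, fm2)

## A single argument: conditional F-tests on the terms in the
## fixed-effects part of the model.
anova(fm1)
```

Note that neither output contains sums of squares or mean squares; the single-argument table reports numerator/denominator degrees of freedom, F-values, and p-values.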