[R-sig-ME] Using R, how to present mixed models vs. regular linear regression models?

Paul Johnson pauljohn32 at gmail.com
Sat Jun 2 20:49:31 CEST 2012


On Thu, May 31, 2012 at 9:52 AM, Michael <comtech.usa at gmail.com> wrote:
> What are the key differences between the following two models?
> lmefit = lmer(MathAch ~ SES + (1 |School) , MathScores)
>
> lmfit = lm(MathAch ~ SES + School -1 , MathScores)
>
> To me, they seem to be the same, except that lmefit estimates fewer
> parameters (because it uses a Normal distribution to model the
> group-level effects...)
>
Dear Michael

This is a big, general philosophical issue, not really an R/lmer
question. You have quite a bit of homework to do. Your claim is both
right and wrong at the same time, I think.

For background reading, I suggest you study the difference between
fixed effects ("least squares dummy variables") and random effects
models.
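
To make the contrast concrete, here is a quick sketch. I'm assuming
your MathScores data frame looks like the MathAchieve data that ships
with the nlme package (columns MathAch, SES, School), so I use that as
a stand-in:

library(lme4)
data(MathAchieve, package = "nlme")
MathScores <- as.data.frame(MathAchieve)   # stand-in for your data

## Random-intercept (mixed) model: school effects drawn from N(0, tau^2)
lmefit <- lmer(MathAch ~ SES + (1 | School), data = MathScores)

## "Least squares dummy variables": one fixed intercept per school
lmfit <- lm(MathAch ~ SES + School - 1, data = MathScores)

length(coef(lmfit))   ## 1 slope plus one intercept per school
VarCorr(lmefit)       ## only a school variance and a residual variance

## Per-school intercepts: freely estimated by lm, but shrunken
## ("partially pooled") toward the overall intercept by lmer.
head(coef(lmfit))           # SES slope, then one intercept per school
head(coef(lmefit)$School)   # fixed intercept + shrunken school effect

The lm fit spends one coefficient on every school; the lmer fit spends
one variance parameter and then predicts the school effects, pulling
them toward the overall intercept.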

For your question about what is being optimized, I suggest you start
with the Pinheiro and Bates book from 2000, and then also study the
papers and book chapters that Doug Bates has made available:
http://lme4.r-forge.r-project.org/book/. There is one I can't find at
the moment that contrasts the GLS view with the penalized ML view, but
I expect you can Google better than I can. Maybe the best entry point
is the useR! tutorial he offered, "Fitting and evaluating mixed models
using lme4" (http://user2010.org/tutorials/Bates.html), or these
slides:
http://lme4.r-forge.r-project.org/slides/2011-01-11-Madison/6NLMM.pdf

I have a folder of PDF manuscripts I've saved over the years. Here are
things to look for. John Fox has an online chapter to supplement his
Companion to Applied Regression. I've found the (many, many) articles
and books by Sophia Rabe-Hesketh and Anders Skrondal to be more
understandable than most writing on methodology (google for The Stata
Journal (2002)). They also have a brand new edition of their mixed
models with Stata book, which I think is quite good (even though it
does not use R). There is an essay that goes with the R package glmmML
that I think is quite helpful, "Generalized Linear Models with Random
Intercepts" by Göran Broström. Also, google for "A First Look at
Multilevel Models" by Georges Monette.

I think you will see there are competing interpretations of mixed
models. The most obvious interpretation (in my view) is Generalized
Least Squares; that's the one I can really understand. However, Prof.
Bates suggests you should instead view it as a penalized maximum
likelihood exercise. I defer to that view and try to understand it,
and I suggest you do too. I recently fell in love with a book by Simon
Wood, Generalized Additive Models: An Introduction with R (2006). The
GAM part is fine, but the first 100 pages leading up to it offer a
superb explanation of regression modeling, the Generalized Linear
Model, and then mixed effects models.
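
One small numerical way to see the shrinkage/penalty idea (this is my
own illustration, not Doug's formulation): in the intercept-only
version of your model, the conditional modes that lmer reports are
just the raw school-mean deviations multiplied by the shrinkage factor
n_j * tau^2 / (n_j * tau^2 + sigma^2). Continuing with the MathScores
data frame built above:

fit0 <- lmer(MathAch ~ 1 + (1 | School), data = MathScores)
tau2 <- as.numeric(VarCorr(fit0)$School)   # between-school variance
sig2 <- sigma(fit0)^2                      # residual variance
mu   <- fixef(fit0)[1]                     # overall intercept

nj     <- tapply(MathScores$MathAch, MathScores$School, length)
ybarj  <- tapply(MathScores$MathAch, MathScores$School, mean)
shrink <- nj * tau2 / (nj * tau2 + sig2)

## The two columns should agree up to numerical tolerance
head(cbind(lmer_blup = ranef(fit0)$School[, 1],
           by_hand   = shrink * (ybarj - mu)))

Schools with few students get pulled harder toward the overall mean,
which is exactly the "borrowing strength" you give up when you fit the
dummy-variable lm instead.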

Good luck, I'm sorry I don't have a comprehensive reading list, but if
you build one, share it back to me :)
pj


-- 
Paul E. Johnson
Professor, Political Science    Assoc. Director
1541 Lilac Lane, Room 504     Center for Research Methods
University of Kansas               University of Kansas
http://pj.freefaculty.org            http://quant.ku.edu


