[R-sig-ME] Standard Error of a coef. in a 2-level model vs. 2 OLS models

Simon Harmel
Mon Sep 14 01:37:38 CEST 2020


Dear All,

I have fit two OLS models (ols1 & ols2) and a mixed-effects model (m1).
ols1 is a simple lm() model that ignores the second level (the school
grouping). ols2 is a simple lm() model that ignores the first level (the
individual students).

For the `ols1` model, `sigma(ols1)^2` is almost equal to the sum of the
variance components in the `m1` model: 6.68 (between) + 39.15 (within).
For the `ols2` model, I wonder what `sigma(ols2)^2` represents when
compared to the `m1` model?

Here is the fully reproducible code:

library(lme4)
library(tidyverse)

hsb <- read.csv('https://raw.githubusercontent.com/rnorouzian/e/master/hsb.csv')

# data that only considers the grouping but ignores the lower (student) level
hsb_ave <- hsb %>%
  group_by(sch.id) %>%
  mutate(math_ave = mean(math)) %>%
  slice(1)

ols1 <- lm(math ~ sector, data = hsb)
summary(ols1)

m1 <- lmer(math ~ sector + (1|sch.id), data = hsb)
summary(m1)

# `sigma(ols1)^2` almost equals 6.68 (between) + 39.15 (within) from the lmer fit
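
For concreteness, here is a small sketch of how I am extracting and comparing
these quantities (I use as.data.frame(VarCorr(m1)) to pull out the variance
components; any equivalent extraction would do):

# extract the two variance components from m1
vc <- as.data.frame(VarCorr(m1))
tau2   <- vc$vcov[vc$grp == "sch.id"]    # between-school variance (~ 6.68)
sigma2 <- vc$vcov[vc$grp == "Residual"]  # within-school variance  (~ 39.15)

sigma(ols1)^2   # residual variance of the OLS fit that ignores the schools
tau2 + sigma2   # sum of the two variance components from m1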

But if I fit another OLS model that only considers the grouping structure
(ignoring the lower level):

ols2 <- lm(math_ave ~ sector, data = hsb_ave)
summary(ols2)

Then what should `sigma(ols2)^2` amount to when compared to the `m1` model?
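
My tentative guess (and this is exactly what I am unsure about) is that under
m1 the variance of a school mean for a school with n_j students is
tau^2 + sigma^2 / n_j, so perhaps sigma(ols2)^2 should be compared against
that quantity at a typical school size. Here is a rough sketch of that
comparison, reusing tau2 and sigma2 from above; school sizes vary in these
data, so the average size is only a crude stand-in:

n_j <- hsb %>% count(sch.id) %>% pull(n)   # number of students per school

sigma(ols2)^2              # residual variance of the school-mean regression
tau2 + sigma2 / mean(n_j)  # variance of a school mean implied by m1,
                           # evaluated at the average school size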
