[R-sig-ME] A question about multilevel models using the lmer package in R

Phillip Alday
Wed Sep 14 10:30:37 CEST 2022


This sounds a bit like Hedge's Reliability Paradox:

Hedge, C., Powell, G. & Sumner, P. The reliability paradox: Why robust
cognitive tasks do not produce reliable individual differences. Behav
Res 50, 1166–1186 (2018). https://doi.org/10.3758/s13428-017-0935-1

There is also an older result due to Spearman about bias of variance
estimates under certain conditions, though I can't recall the details
off the top of my head.
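
For what it's worth, the basic mechanism is easy to reproduce in a small
simulation (a minimal sketch with made-up numbers, not the original
data): if a student-level covariate varies only within schools, adding
it soaks up residual variance, and because the between-school spread of
the observed school means stays the same, the estimated school-level
variance tends to go *up* rather than down. Snijders & Bosker discuss
this, and level-specific R^2 measures that behave better, in their
multilevel textbook.

library(lme4)

set.seed(101)
n_school  <- 50
n_student <- 6
school <- factor(rep(seq_len(n_school), each = n_student))

u <- rnorm(n_school, sd = 1.5)     # school-level random intercepts
x <- rnorm(n_school * n_student)
x <- x - ave(x, school)            # centre within school: x has no between-school variance
y <- 2 + 3 * x + u[as.integer(school)] + rnorm(n_school * n_student, sd = 2)
dat <- data.frame(y = y, x = x, school = school)

m0 <- lmer(y ~ 1 + (1 | school), data = dat)   # without the covariate
m1 <- lmer(y ~ x + (1 | school), data = dat)   # with the covariate

VarCorr(m0)   # school SD tends to be smaller here ...
VarCorr(m1)   # ... than here, even though the residual SD shrinks a lot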

(sent a second time without PGP signing -- would it be possible to add
PGP signatures to the allowed MIME types?)

On 9/13/22 16:56, Ben Bolker wrote:
>   (This came through twice, for what it's worth.)
>
>   I can't immediately explain the phenomenon you observed (variance at
> an upper level increasing when a covariate is added), but I'm not
> shocked -- when there are multiple levels of variation, variation can
> shift among them in surprising ways.
>
>   Can you clarify/expand on what you mean by "[evaluating] the impact
> of a predictor on student/school-level intercepts"?
>
> On 2022-09-13 10:46 a.m., Verena Hinze wrote:
>> Dear mailing list at R-sig-mixed-models,
>>
>> I am very interested in using multilevel growth analyses for my
>> research.
>> I have come across a lot of very helpful tutorials on the internet
>> recommending lmer from the lme4 package.
>> However, I have one question for which I haven't managed to find an
>> answer yet, and I was wondering whether you might be able to point me
>> in the right direction...
>>
>> Specifically, I am interested in modelling three-level data
>> (measurements over time nested within students, who are nested within
>> schools). I am hoping to predict student outcomes from student-level
>> and school-level predictors, and I have been using lmer (from the lme4
>> package) in R to model the data.
>>
>> However, in the model output I have noticed that sometimes the
>> variance (for the student- or school-level intercepts or slopes)
>> increases rather than decreases, compared to the simpler model without
>> the respective predictor. This is contrary to what we would expect
>> from ordinary regression analyses, where the residual variance should
>> decrease if we add predictors that help to explain it.
>>
>> I was wondering what might be going on here? Have you encountered
>> something similar before? And how could we evaluate the impact of a
>> predictor on these student-/school-level intercepts and slopes in
>> multi-level models instead?
>>
>> Any advice would be highly appreciated.
>>
>> With many thanks and kind regards,
>>
>> Verena
>>
>> Verena Hinze
>> Postdoctoral Research Fellow
>> Oxford Precision Psychiatry Lab
>> University of Oxford, Department of Psychiatry
>> Warneford Lane, Oxford, OX3 7JX
>> E: verena.hinze using psych.ox.ac.uk
>>
>>
>> _______________________________________________
>> R-sig-mixed-models using r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-sig-mixed-models
>
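
Re Verena's last question about how to evaluate the impact of a
predictor at a given level: one option that avoids comparing raw
variance components is the Snijders & Bosker style proportional
reduction in prediction error, computed from a baseline fit and the fit
with the predictor(s). A rough sketch for a random-intercept model,
reusing the hypothetical m0/m1 fits from the simulation above (n is the
number of students per school):

## pull the school-level and residual variances out of a fitted model
vc <- function(m) {
  v <- as.data.frame(VarCorr(m))
  c(school = v$vcov[v$grp == "school"], resid = v$vcov[v$grp == "Residual"])
}
v0 <- vc(m0); v1 <- vc(m1)
n  <- 6   # students per school in the simulated data

## R^2 at level 1: reduction in prediction error for an individual outcome
R2_level1 <- 1 - (v1[["school"]] + v1[["resid"]]) / (v0[["school"]] + v0[["resid"]])
## R^2 at level 2: reduction in prediction error for a school mean
R2_level2 <- 1 - (v1[["school"]] + v1[["resid"]] / n) / (v0[["school"]] + v0[["resid"]] / n)

For models with random slopes this gets more involved; the MuMIn and
performance packages implement related (Nakagawa-style) R^2 measures.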