[R-sig-ME] Why am I getting a Variance of 0 for my random effect

Douglas Bates bates at stat.wisc.edu
Wed Aug 11 22:05:06 CEST 2010


On Wed, Aug 11, 2010 at 2:18 PM, Kevin E. Thorpe
<kevin.thorpe at utoronto.ca> wrote:
> On 08/11/2010 02:25 PM, Douglas Bates wrote:
>>
>> On Wed, Aug 11, 2010 at 1:13 PM, Kevin E. Thorpe
>> <kevin.thorpe at utoronto.ca>  wrote:
>>>
>>> Hello.
>>
>>> I'm getting a variance of 0 on a random effect and I don't know why.
>>
>> It's not a bug - it's a feature.  ML estimates or REML estimates of
>> variance components can be zero.  This simply indicates that the
>> variability in the response associated with the factor, RN in your
>> case, is not sufficient to warrant the additional complexity in the
>> model.
>>
>>> I suspect I've not set the model up correctly.  My transcript is below
>>> with my own comments sprinkled in from time to time.
>>>
>>> A little bit about the data (which I will provide off-list if requested).
>>>  We have nurses managing an aspect of patient care
>>> according to different algorithms.  Interest focuses on whether the
>>> algorithms result in different outcomes.  I have restricted this
>>> to only nurses who did each algorithm twice (in case my problem
>>> was being caused by some nurses doing only one algorithm, possibly
>>> only one time).
>>>
>>> I figured that since I have multiple observations per nurse, I
>>> should treat nurse as a random effect, but maybe I confused myself
>>> again.
>>
>> You are quite correct that it is appropriate to allow for the
>> possibility of RN having an effect on the response and that, if RN
>> were in the model, it should be incorporated as a random effect.  The
>> results, however, indicate that RN does not have a sufficient effect
>> on the response.
>
> Thanks Doug.  Your response is helpful, as always.  If RN does not
> contribute a random effect, would it be appropriate to revert to a
> standard regression model, or is it best to leave the unimportant
> random effect in the model?  In the case of ordinary regression, dropping
> variables based on their p-values compromises inference.  Does the
> same apply with dropping a random effect with no variance?

When the random-effects variance is zero the model reverts to the
linear regression model, in the sense that the two models give the same
predicted values, the same log-likelihood, and the same fixed-effects
coefficient estimates and standard errors.  That is, there is no need to
continue to represent the model as a linear mixed model when the only
variance-component parameter is estimated as zero.
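Here is a minimal sketch of that check.  Since data1 is not posted, it
simulates data with the same layout (18 RNs, each doing both algorithms
twice) and no RN contribution at all, so the RN variance component will
often be estimated as exactly zero; the object names, the seed, and the
simulation settings are only for illustration.

library(lme4)

set.seed(1)
sim <- data.frame(
    RN         = factor(rep(1:18, each = 4)),
    Assignment = factor(rep(c("E", "E", "N", "N"), times = 18)),
    AUChr      = 9 + rnorm(72, sd = 3)    # no RN effect in the simulation
)

## ML fit, as in the transcript, and the corresponding linear model
fm1 <- lmer(AUChr ~ Assignment + (1 | RN), data = sim, REML = FALSE)
fm0 <- lm(AUChr ~ Assignment, data = sim)

VarCorr(fm1)    # check whether the RN variance is estimated as zero

## When the RN variance is zero, the two fits agree as described above
logLik(fm1)
logLik(fm0)
cbind(lmer = fixef(fm1), lm = coef(fm0))              # fixed effects
all.equal(unname(fitted(fm1)), unname(fitted(fm0)))   # fitted values

With the original data, the same comparison can be run directly by
replacing sim with data1.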

> Kevin
>
>>
>>>
>>> R version 2.11.1 Patched (2010-07-21 r52598)
>>> Copyright (C) 2010 The R Foundation for Statistical Computing
>>> ISBN 3-900051-07-0
>>>
>>> R is free software and comes with ABSOLUTELY NO WARRANTY.
>>> You are welcome to redistribute it under certain conditions.
>>> Type 'license()' or 'licence()' for distribution details.
>>>
>>>  Natural language support but running in an English locale
>>>
>>> R is a collaborative project with many contributors.
>>> Type 'contributors()' for more information and
>>> 'citation()' on how to cite R or R packages in publications.
>>>
>>> Type 'demo()' for some demos, 'help()' for on-line help, or
>>> 'help.start()' for an HTML browser interface to help.
>>> Type 'q()' to quit R.
>>>
>>>> library(lattice)
>>>> library(lme4)
>>>
>>>> str(data1)
>>>
>>> 'data.frame':   72 obs. of  3 variables:
>>>  $ RN        : int  1 1 2 3 7 7 9 9 15 15 ...
>>>  $ Assignment: Factor w/ 2 levels "E","N": 1 1 1 1 1 1 1 1 1 1 ...
>>>  $ AUChr     : num  12.26 7.23 9.26 4.04 10.31 ...
>>>>
>>>> tmp1<-
>>>> with(data1,aggregate(AUChr,list(RN=RN,Assignment=Assignment),mean))
>>>> names(tmp1)[3]<- "Mean"
>>>>
>>>> tmp2<-
>>>> with(data1,aggregate(AUChr,list(RN=RN,Assignment=Assignment),var))
>>>> names(tmp2)[3]<- "Variance"
>>>>
>>>> meanvar<- merge(tmp1,tmp2)
>>>
>>> The point of this is to show that the means are not all the same,
>>> nor are the variances.
>>>
>>>> meanvar
>>>
>>>   RN Assignment   Mean  Variance
>>> 1   1          E  9.745  12.65045
>>> 2   1          N  7.185   1.36125
>>> 3  15          E 10.605  15.07005
>>> 4  15          N 10.385   4.41045
>>> 5  16          E  8.175   0.00845
>>> 6  16          N  8.420   1.03680
>>> 7   2          E  7.300   7.68320
>>> 8   2          N  6.950   1.00820
>>> 9  21          E  9.670   9.41780
>>> 10 21          N 10.535   2.44205
>>> 11 22          E  7.720   2.04020
>>> 12 22          N  7.930   1.21680
>>> 13 24          E  9.555  10.35125
>>> 14 24          N  9.330   0.38720
>>> 15 25          E  8.240   0.92480
>>> 16 25          N  9.485   0.00125
>>> 17 27          E  8.635   0.08405
>>> 18 27          N  7.745   3.72645
>>> 19 28          E  9.635   8.61125
>>> 20 28          N  8.315  10.35125
>>> 21  3          E  6.005   7.72245
>>> 22  3          N 11.435  55.44045
>>> 23 31          E  9.590   9.94580
>>> 24 31          N 10.570  16.70420
>>> 25 35          E  9.055   0.32805
>>> 26 35          N  9.925  14.41845
>>> 27 36          E  9.040   2.08080
>>> 28 36          N  7.395   1.14005
>>> 29  5          E  8.430   3.38000
>>> 30  5          N 17.385 139.94645
>>> 31  6          E  6.930   0.24500
>>> 32  6          N  8.330   1.72980
>>> 33  7          E 10.650   0.23120
>>> 34  7          N  7.375   0.09245
>>> 35  9          E  8.885   7.56605
>>> 36  9          N  8.405   0.73205
>>>
>>> Model with "Assignment" (algorithm).
>>>
>>>> lmer(AUChr~Assignment+(1|RN),data=data1,REML=FALSE)
>>>
>>> Linear mixed model fit by maximum likelihood
>>> Formula: AUChr ~ Assignment + (1 | RN)
>>>   Data: data1
>>>   AIC   BIC logLik deviance REMLdev
>>>  365.7 374.8 -178.8    357.7   356.9
>>> Random effects:
>>>  Groups   Name        Variance Std.Dev.
>>>  RN       (Intercept) 0.0000   0.0000
>>>  Residual             8.4152   2.9009
>>> Number of obs: 72, groups: RN, 18
>>>
>>> Fixed effects:
>>>            Estimate Std. Error t value
>>> (Intercept)   8.7703     0.4835   18.14
>>> AssignmentN   0.5131     0.6837    0.75
>>>
>>> Correlation of Fixed Effects:
>>>            (Intr)
>>> AssignmentN -0.707
>>>
>>>
>>> Model without the algorithm variable.
>>>
>>>> lmer(AUChr~(1|RN),data=data1,REML=FALSE)
>>>
>>> Linear mixed model fit by maximum likelihood
>>> Formula: AUChr ~ (1 | RN)
>>>   Data: data1
>>>   AIC   BIC logLik deviance REMLdev
>>>  364.3 371.1 -179.1    358.3   358.5
>>> Random effects:
>>>  Groups   Name        Variance Std.Dev.
>>>  RN       (Intercept) 0.000    0.0000
>>>  Residual             8.481    2.9122
>>> Number of obs: 72, groups: RN, 18
>>>
>>> Fixed effects:
>>>            Estimate Std. Error t value
>>> (Intercept)   9.0268     0.3432    26.3
>>>>
>>>> sessionInfo()
>>>
>>> R version 2.11.1 Patched (2010-07-21 r52598)
>>> Platform: i686-pc-linux-gnu (32-bit)
>>>
>>> locale:
>>>  [1] LC_CTYPE=en_US       LC_NUMERIC=C         LC_TIME=en_US
>>>  [4] LC_COLLATE=C         LC_MONETARY=C        LC_MESSAGES=en_US
>>>  [7] LC_PAPER=en_US       LC_NAME=C            LC_ADDRESS=C
>>> [10] LC_TELEPHONE=C       LC_MEASUREMENT=en_US LC_IDENTIFICATION=C
>>>
>>> attached base packages:
>>> [1] stats     graphics  grDevices utils     datasets  methods   base
>>>
>>> other attached packages:
>>> [1] lme4_0.999375-34   Matrix_0.999375-42 lattice_0.18-8
>>>
>>> loaded via a namespace (and not attached):
>>> [1] grid_2.11.1   nlme_3.1-96   stats4_2.11.1
>>>>
>>>> proc.time()
>>>
>>>   user  system elapsed
>>>  3.488   0.056   3.536
>
>
> --
> Kevin E. Thorpe
> Biostatistician/Trialist, Knowledge Translation Program
> Assistant Professor, Dalla Lana School of Public Health
> University of Toronto
> email: kevin.thorpe at utoronto.ca  Tel: 416.864.5776  Fax: 416.864.3016
>



