[R] meta-regression, MiMa function, and R-squared

Viechtbauer Wolfgang (STAT) Wolfgang.Viechtbauer at STAT.unimaas.nl
Mon Mar 12 15:36:48 CET 2007


Yes, there is indeed a slight difference. The models fitted by lm() with the weights option (and this is the same in essentially all other software) assume that the weights are known only up to a proportionality constant, which is estimated by the residual standard error. The parameter estimates will be exactly the same, but the standard errors will differ by exactly that multiplicative constant. If you divide the standard errors that lm() reports with the weights option by the residual standard error, you get exactly the same standard errors as those given by the mima() function. Fortunately, that multiplicative constant has no bearing on the value of R^2. You can see this by using "lm(y ~ x1 + ... + xp, weights=w*10)": the value of R^2 is unchanged.
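To illustrate, here is a minimal self-contained sketch with simulated data (the names y, x1, v, and w are just placeholders, not taken from your analysis):

  # simulate k studies with known sampling variances v
  set.seed(123)
  k  <- 20
  x1 <- rnorm(k)
  v  <- runif(k, 0.05, 0.5)
  y  <- 0.3 + 0.5 * x1 + rnorm(k, sd = sqrt(v))
  w  <- 1 / v                      # inverse-variance weights

  fit1 <- lm(y ~ x1, weights = w)
  fit2 <- lm(y ~ x1, weights = w * 10)

  # R^2 does not depend on the scale of the weights
  summary(fit1)$r.squared
  summary(fit2)$r.squared

  # dividing the lm() standard errors by the residual standard error
  # gives the standard errors under exactly known weights, i.e., what
  # mima() reports when given the same inverse-variance weights
  coef(summary(fit1))[, "Std. Error"] / summary(fit1)$sigma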

Best,

-- 
Wolfgang Viechtbauer 
 Department of Methodology and Statistics 
 University of Maastricht, The Netherlands 
 http://www.wvbauer.com/ 



-----Original Message-----
From: Christian Gold [mailto:c.gold at magnet.at] 
Sent: Monday, March 12, 2007 13:35
To: Viechtbauer Wolfgang (STAT)
Cc: r-help at stat.math.ethz.ch
Subject: Re: meta-regression, MiMa function, and R-squared


Dear Wolfgang

Thanks for your prompt and clear response concerning R^2. You write:

> Note that the mima function does nothing else but fit the model with
> weighted least squares using those weights. So, you could actually use
> "lm(y ~ x1 + ... + xp, weights=w)" and you should get the exact same
> parameter estimates. Therefore, "summary(lm(y ~ x1 + ... + xp, weights=w))"
> will give you R^2.

Is this really true? I thought that "in weighted regression the /relative/ weights are assumed known whereas in meta-regression the /actual/ weights are assumed known" (Higgins & Thompson, 2004, "Controlling the risk of spurious findings from meta-regression", Statistics in Medicine, 23, p. 1665). Also, I did calculate my regression problem with lm() using inverse-variance weights before I discovered your function, and have now compared the results. The regression coefficient was the same, but the confidence interval was wider with mima. Furthermore, the CI with mima depended on the absolute size of the weights (as I assume it should do), whereas with lm() it did not. Can you explain?
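For example, in a small sketch with simulated data (placeholder names, not my actual data), the lm() confidence interval is completely unaffected by rescaling the weights:

  set.seed(1)
  k  <- 20
  x1 <- rnorm(k)
  v  <- runif(k, 0.05, 0.5)        # known sampling variances
  y  <- 0.3 + 0.5 * x1 + rnorm(k, sd = sqrt(v))
  w  <- 1 / v

  confint(lm(y ~ x1, weights = w))
  confint(lm(y ~ x1, weights = w * 10))   # identical intervals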

Thanks

Christian


