[R] linear regression: evaluating the result Q

Prof Brian Ripley ripley at stats.ox.ac.uk
Thu Sep 16 18:03:09 CEST 2004


On Thu, 16 Sep 2004, René J.V. Bertin wrote:

> Dear all,
> 
> A few quick questions about interpreting and evaluating the results of
> linear regressions, to which I hope equally quick answers are possible.
> 
> 1) The summary.lm method prints the R and R^2 correlation coefficients
> (something reviewers like to see). It works on glm objects and (after
> tweaking it to initialise z$df.residual with rdf) also on rlm objects.
> Are the R, R^2 and also the p values reported reliable for these fit
> results? If not, how do I calculate them best?

Well, for rlm no, as it is not least-squares fitting and R^2 is very 
susceptible to outliers.  For glm, not really unless it is a Gaussian 
model.
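
For the Gaussian case, a minimal sketch with made-up data (not from the
original thread): the least-squares R^2 can be recovered from the glm
deviances, since the deviance is the residual sum of squares and the null
deviance the total sum of squares about the mean.

  set.seed(1)
  x <- 1:20
  y <- 2 + 0.8*x + rnorm(20)
  fit.lm  <- lm(y ~ x)
  fit.glm <- glm(y ~ x, family = gaussian)
  summary(fit.lm)$r.squared                    # least-squares R^2
  1 - fit.glm$deviance/fit.glm$null.deviance   # same value from the glm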

> 2) For a simple 1st order linear fit, what is the best way to calculate
> the (95%) confidence interval on/of the slope?

Use confint.  (MASS chapter 7 has examples.)
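
A minimal sketch with made-up data (not from the original thread):

  set.seed(1)
  x <- 1:20
  y <- 2 + 0.8*x + rnorm(20)
  fit <- lm(y ~ x)
  confint(fit, "x", level = 0.95)   # 95% confidence interval for the slope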

> 3) The p values reported for the calculated coefficients and intercept
> indicate to what extent these values are significantly different from
> zero (right?). 

Yes.

> Aside from question 2), what is the best way to compare
> the calculated slope with another slope (say of the unity line)?

Use offset, as in y ~ x + offset(x), and test whether the coefficient of x
is zero.  (That's R only, BTW.)
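
A minimal sketch with made-up data (not from the original thread): the
offset term absorbs a unity slope, so the fitted coefficient of x is
(slope - 1) and its t-test is a test of slope == 1.

  set.seed(1)
  x <- 1:20
  y <- 2 + 0.8*x + rnorm(20)
  fit <- lm(y ~ x + offset(x))       # coefficient of x is now (slope - 1)
  summary(fit)$coefficients["x", ]   # its t-test is a test of slope == 1
  confint(fit, "x")                  # CI for (slope - 1)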

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



