[R] Standardized beta-coefficients in regression

Dr. Walter H. Schreiber whschreiber at onlinehome.de
Tue Jan 17 07:03:23 CET 2006

I had intended not to answer, since my question was about R and not about 
the justification of a specific statistic. But nonetheless:

(a) standardized betas are freed from the original scale and make it easy to 
read off each variable's relative influence on the regression hyperplane. I 
find it easy to think in terms of deviations in SD units.

(b) If you do model building and follow up on the changes in the betas, you 
soon realize that they are fine indicators and give hints about 
multicollinearity (yes, I know about vif(), although I prefer 1/vif(); I also 
know that I can check for intercorrelations in advance).

(c) as Peter mentioned, APA requires it.

(d) to assist converts from Sxx[x] who might miss something.
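
As a sketch of point (a): standardized betas can be obtained in R either by refitting on scale()d variables or by rescaling the raw coefficients. The data and variable names below (y, x1, x2) are simulated for illustration and are not from the original thread.

```r
# Simulated data (illustrative only)
set.seed(1)
x1 <- rnorm(100)
x2 <- rnorm(100)
y  <- 2 * x1 + 0.5 * x2 + rnorm(100)

# Route 1: standardize every variable to mean 0, SD 1, then refit
fit_std <- lm(scale(y) ~ scale(x1) + scale(x2))

# Route 2: rescale the raw coefficients by sd(x_j)/sd(y)
fit_raw  <- lm(y ~ x1 + x2)
beta_std <- coef(fit_raw)[-1] * c(sd(x1), sd(x2)) / sd(y)
```

Both routes give the same standardized slopes; the second makes explicit that a standardized beta is just the raw slope times sd(x)/sd(y).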


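On point (b): vif() lives in the car package, but the VIF (and its reciprocal, the tolerance that 1/vif() amounts to) can be hand-rolled in base R by regressing one predictor on the others. The simulated, deliberately collinear data below are an assumption for illustration.

```r
# Simulated collinear predictors (illustrative only)
set.seed(2)
x1 <- rnorm(100)
x2 <- x1 + rnorm(100, sd = 0.5)   # strongly related to x1

# VIF for x1: regress x1 on the remaining predictor(s)
r2     <- summary(lm(x1 ~ x2))$r.squared
vif_x1 <- 1 / (1 - r2)            # variance inflation factor
tol_x1 <- 1 - r2                  # tolerance, i.e. 1/VIF
```

A tolerance near 0 (VIF far above 1) signals that the predictor is nearly a linear combination of the others.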

Am Dienstag, 17. Januar 2006 01:21 schrieb Peter Flom:
> On Mon, 16 Jan 2006, Dr. Walter H. Schreiber wrote:
> > Hello list,
> >
> > I am used to give a lot of attention to the standardized regression
> > coefficients, which in SPSS are listed automatically.
> >
> >>> Prof Brian Ripley <ripley at stats.ox.ac.uk> replied <<<
> I do wonder why?  Most people I have encountered who do that are
> interpreting them in invalid ways.
> and
> Yes, but why do you want one?  (You don't need summary, just coef, in the
> second line, and you also do not need an intercept.)  For a single
> regressor as here, just cor(ctl, trt).
> One reason might be that the American Psychological Ass'n requires them
> for its regression tables.  As to why the APA requires them, I couldn't
> say, but require them they do.
> Peter
> Peter
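
Ripley's single-regressor remark can be checked numerically: with one predictor, the standardized slope equals the sample correlation. The vectors below reuse the names ctl and trt from his example but are simulated here, not the original data.

```r
# Simulated stand-ins for ctl and trt (illustrative only)
set.seed(3)
ctl <- rnorm(20)
trt <- 0.6 * ctl + rnorm(20)

# Standardized slope from regressing scale(trt) on scale(ctl)
b_std <- unname(coef(lm(scale(trt) ~ scale(ctl)))[2])
r     <- cor(ctl, trt)            # equals b_std
```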

Dr. Walter H. Schreiber
