[R] FW: logistic regression

Greg Snow Greg.Snow at imail.org
Mon Sep 29 19:24:28 CEST 2008

> -----Original Message-----
> From: r-help-bounces at r-project.org [mailto:r-help-bounces at r-
> project.org] On Behalf Of Frank E Harrell Jr
> Sent: Saturday, September 27, 2008 7:15 PM
> To: Darin Brooks
> Cc: dieter.menne at menne-biomed.de; r-help at stat.math.ethz.ch;
> ted.harding at manchester.ac.uk
> Subject: Re: [R] FW: logistic regression
> Darin Brooks wrote:
> > Glad you were amused.
> >
> > I assume that "booking this as a fortune" means that this was an
> > idiotic way to model the data?
> Dieter was nominating this for the "fortunes" package in R.  (Thanks
> Dieter)
> >
> > MARS?  Boosted Regression Trees?  Any of these a better choice to
> > extract significant predictors (from a list of about 44) for a
> > measured dependent variable?
> Or use a data reduction method (principal components, variable
> clustering, etc.) or redundancy analysis (to remove individual
> predictors before examining associations with Y), or fit the full model
> using penalized maximum likelihood estimation.  lasso and lasso-like
> methods are also worth pursuing.
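[For readers of the archive: one of the data-reduction options Frank mentions (principal components) can be sketched in a few lines of base R. The data below are simulated purely for illustration, with 44 predictors to echo the original post; this is not the poster's actual analysis.]

```r
# Sketch of unsupervised data reduction before logistic regression:
# replace many correlated predictors with a few principal components,
# then fit the logistic model on the components. Simulated data only.
set.seed(1)
n <- 200; p <- 44
X <- matrix(rnorm(n * p), n, p)
y <- rbinom(n, 1, plogis(X[, 1] - X[, 2]))

pc <- prcomp(X, scale. = TRUE)   # unsupervised: y is not used here
k  <- 5                          # keep only the first few components
dat <- data.frame(y = y, pc$x[, 1:k])

fit <- glm(y ~ ., data = dat, family = binomial)
summary(fit)$coefficients
```

[Because the reduction never looks at y, the usual inferential problems of stepwise predictor selection are avoided; penalized likelihood or the lasso (e.g. via the glmnet package) are alternatives when the original predictors must be kept interpretable.]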

Frank (and any others who want to share an opinion):

What are your thoughts on model averaging as part of the above list?
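[For concreteness, one common flavor of what "model averaging" means here is AIC-weighted averaging over a small set of candidate models. The sketch below uses made-up data and an arbitrary candidate set, base R only.]

```r
# Hedged sketch of AIC-based model averaging: fit several candidate
# logistic models, convert AIC differences to Akaike weights, and
# average the fitted probabilities. Data and candidates are illustrative.
set.seed(2)
n <- 150
x1 <- rnorm(n); x2 <- rnorm(n); x3 <- rnorm(n)
y  <- rbinom(n, 1, plogis(0.8 * x1 - 0.5 * x2))

models <- list(
  glm(y ~ x1,           family = binomial),
  glm(y ~ x1 + x2,      family = binomial),
  glm(y ~ x1 + x2 + x3, family = binomial)
)

aic <- sapply(models, AIC)
w   <- exp(-(aic - min(aic)) / 2)   # relative likelihoods
w   <- w / sum(w)                   # Akaike weights, sum to 1

preds <- sapply(models, fitted)     # n x 3 matrix of fitted probabilities
avg   <- drop(preds %*% w)          # model-averaged probabilities
```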

Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
greg.snow at imail.org
