[R] FW: logistic regression
Frank E Harrell Jr
f.harrell at vanderbilt.edu
Tue Sep 30 03:50:45 CEST 2008
Greg Snow wrote:
>> -----Original Message-----
>> From: r-help-bounces at r-project.org [mailto:r-help-bounces at r-
>> project.org] On Behalf Of Frank E Harrell Jr
>> Sent: Saturday, September 27, 2008 7:15 PM
>> To: Darin Brooks
>> Cc: dieter.menne at menne-biomed.de; r-help at stat.math.ethz.ch;
>> ted.harding at manchester.ac.uk
>> Subject: Re: [R] FW: logistic regression
>>
>> Darin Brooks wrote:
>>> Glad you were amused.
>>>
>>> I assume that "booking this as a fortune" means that this was an
>>> idiotic way to model the data?
>> Dieter was nominating this for the "fortunes" package in R. (Thanks
>> Dieter)
>>
>>> MARS? Boosted Regression Trees? Any of these a better choice to
>>> extract significant predictors (from a list of about 44) for a
>>> measured dependent variable?
>> Or use a data reduction method (principal components, variable
>> clustering, etc.) or redundancy analysis (to remove individual
>> predictors before examining associations with Y), or fit the full
>> model using penalized maximum likelihood estimation. The lasso and
>> lasso-like methods are also worth pursuing.
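As an aside, the L1-penalized (lasso) approach mentioned above can be sketched quickly. This is only an illustration on synthetic data, not a recommendation of particular settings; it uses Python's scikit-learn, and the variable names and the penalty strength C=0.5 are arbitrary choices for the example:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 200, 44  # 44 candidate predictors, as in the original question
X = rng.normal(size=(n, p))

# Synthetic truth: only the first three predictors matter
beta = np.zeros(p)
beta[:3] = [1.5, -1.0, 0.8]
y = (rng.random(n) < 1 / (1 + np.exp(-(X @ beta)))).astype(int)

# L1-penalized logistic regression shrinks many coefficients exactly to zero,
# so the surviving nonzero coefficients act as the "selected" predictors
fit = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
fit.fit(X, y)
selected = np.flatnonzero(fit.coef_[0])
print("nonzero coefficients:", selected)
```

The point of the sketch is that selection and shrinkage happen in a single fit, rather than by repeated significance testing of 44 candidates.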
>
> Frank (and any others who want to share an opinion):
>
> What are your thoughts on model averaging as part of the above list?
Model averaging has good performance but no advantage over fitting a
single complex model using penalized maximum likelihood estimation.
Frank
>
>
> --
> Gregory (Greg) L. Snow Ph.D.
> Statistical Data Center
> Intermountain Healthcare
> greg.snow at imail.org
> 801.408.8111
--
Frank E Harrell Jr, Professor and Chair,
Department of Biostatistics, Vanderbilt University School of Medicine