[R] alternative to logistic regression

Greg Snow Greg.Snow at intermountainmail.org
Mon Nov 19 19:37:50 CET 2007


Why not try it out for yourself to see how much the predictions change:

# Simulate data from a true logistic model with slope 3
x <- runif(100, -1, 1)
p <- exp(3*x)/(1+exp(3*x))   # true probabilities
y <- rbinom(100, 1, p)       # Bernoulli outcomes

# Plot the true probabilities, the observed 0/1 outcomes, and the true curve
plot(x, p, xlim=c(-1,1), ylim=c(0,1), col='blue')
points(x, y)
xx <- seq(-1, 1, length=250)
lines(xx, exp(3*xx)/(1+exp(3*xx)), col='blue')

# Fit with x as a numeric predictor, then with x cut into bins of width 0.2
fit1 <- glm(y ~ x, family = binomial)
fit2 <- glm(y ~ cut(x, seq(-1, 1, 0.2)), family = binomial)

# Overlay the fitted probabilities from the two models
points(x, predict(fit1, type='response'), col='red')
points(x, predict(fit2, type='response'), col='green')
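In the simulation above the true relationship really is linear on the logit scale, so the two fits should roughly agree. As Prof Ripley points out in the quoted reply, they can disagree badly when the relationship is quadratic; here is a hypothetical extension of the same exercise (my own construction, not part of the original reply) that puts numbers on it:

```r
## Hypothetical extension: when logit(p) is quadratic in x, the
## linear-in-x fit and the binned fit give very different predictions.
set.seed(42)
x <- runif(500, -1, 1)
p <- plogis(3 * x^2 - 1)   # true probabilities: quadratic on the logit scale
y <- rbinom(500, 1, p)

fit.lin <- glm(y ~ x, family = binomial)                        # numeric predictor
fit.cut <- glm(y ~ cut(x, seq(-1, 1, 0.2)), family = binomial)  # binned predictor

pr.lin <- predict(fit.lin, type = "response")
pr.cut <- predict(fit.cut, type = "response")

## The binned fit can track the U-shaped curve; the linear fit cannot,
## so the fitted probabilities differ substantially for some x.
summary(abs(pr.lin - pr.cut))
```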

Hope this helps,

-- 
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
greg.snow at intermountainmail.org
(801) 408-8111
 
 

> -----Original Message-----
> From: r-help-bounces at r-project.org 
> [mailto:r-help-bounces at r-project.org] On Behalf Of 
> markleeds at verizon.net
> Sent: Friday, November 16, 2007 10:28 AM
> To: Prof Brian Ripley; markleeds at verizon.net
> Cc: r-help at r-project.org; Terry Therneau
> Subject: Re: [R] alternative to logistic regression
> 
> >From: Prof Brian Ripley <ripley at stats.ox.ac.uk>
> >Date: 2007/11/16 Fri AM 09:44:59 CST
> >To: markleeds at verizon.net
> >Cc: Terry Therneau <therneau at mayo.edu>, r-help at r-project.org
> >Subject: Re: Re: [R] alternative to logistic regression
> 
> Thanks Brian: I'll look at the MASS book example for sure, but I
> don't think I was so clear in my last question, so let me explain
> again.
> 
> What I meant to say was:
> 
> Suppose Person A and Person B both have the same raw data, which is a
> categorical response (say 3 categories) and 1 numeric predictor.
> 
> Now, suppose Person A fits a logistic regression with the logit link
> and family = binomial, so that it's an S curve in probability space,
> and the predictor is numeric, so the x axis is numeric.
> 
> Suppose Person B fits a logistic regression with the logit link and
> family = binomial, so that it's an S curve in probability space, but
> the predictor is a factor, so the x axis is, say, deciles.
> 
> They both then predict off their respective models given a new value
> of the predictor (Person A's predictor is in the form of a number,
> and Person B's is the decile into which the number fell).
> 
> Would their forecasts of the probability given that predictor be
> roughly the same? I'm sorry to be a pest, but I'm not clear on that.
> Thanks, and I'm sorry to bother you so much.
> 
> >On Fri, 16 Nov 2007, markleeds at verizon.net wrote:
> >
> >>> From: Prof Brian Ripley <ripley at stats.ox.ac.uk>
> >>> Date: 2007/11/16 Fri AM 09:28:27 CST
> >>> To: Terry Therneau <therneau at mayo.edu>
> >>> Cc: markleeds at verizon.net, r-help at r-project.org
> >>> Subject: Re: [R] alternative to logistic regression
> >>
> >> Thanks to both of you, Terry and Brian, for your comments. I'm not
> >> sure what I am going to do yet because I don't have enough data to
> >> explore/confirm my linear hypothesis, but your comments will help
> >> if I go that route.
> >>
> >> I just had one other question since I have you both thinking about
> >> GLMs at the moment: Suppose one is doing logistic or, more
> >> generally, multinomial regression with one predictor. The
> >> predictor is quantitative in the range [-1, 1], but if I scale it,
> >> then the range becomes whatever it becomes.
> >>
> >> But there's also the possibility of making the predictor a factor,
> >> say by deciling it and then letting the deciles be the factor
> >> levels.
> >>
> >> My question is whether one would expect roughly the same
> >> probability forecasts from the two models, one using the numerical
> >> predictor and one using the factor? I imagine it shouldn't matter
> >> much, but I have ZERO experience with logistic regression and I'm
> >> not confident in my current intuition. Thanks so much for talking
> >> about my problem; I really appreciate your insights.
> >
> >It's just as in linear regression. If there really is a linear
> >relationship, the predictions will be the same. But if it is
> >quadratic, they will be very different. Discretizing a numeric
> >explanatory variable is a common way to look for non-linearity (as
> >in the 'cpus' example studied in MASS).
> >
> >
> >-- 
> >Brian D. Ripley,                  ripley at stats.ox.ac.uk
> >Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
> >University of Oxford,             Tel:  +44 1865 272861 (self)
> >1 South Parks Road,                     +44 1865 272866 (PA)
> >Oxford OX1 3TG, UK                Fax:  +44 1865 272595
> 
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide 
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
> 


