[R] nonmonotonic glm?

Stanislav Aggerwal stan.aggerwal at gmail.com
Mon Jan 12 13:45:17 CET 2015


Thanks very much, Marc and Ben, for the helpful suggestions.

Stan

On Sun, Jan 11, 2015 at 10:28 PM, Ben Bolker <bbolker at gmail.com> wrote:

> If you're going to use splines, another possibility is mgcv::gam (also
> part of the standard R installation):
>
>   require(mgcv)
>   gam(DV ~ s(IV), data = YourDataFrame, family = binomial)
>
> This has the advantage that the complexity of the spline is
> automatically adjusted/selected by the fitting algorithm (although
> occasionally you need to use s(IV, k=something_bigger) to adjust the
> default *maximum* complexity chosen by the code).
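>
> For instance, a minimal illustrative sketch (simulated data; the column
> names here are placeholders, not the actual variables in question):
>
>   library(mgcv)
>   set.seed(1)
>   d <- data.frame(IV = seq(-3, 3, length.out = 200))
>   d$DV <- rbinom(200, 1, plogis(1 - d$IV^2))   # nonmonotonic "inverted U"
>   fit <- gam(DV ~ s(IV), family = binomial, data = d)
>   gam.check(fit)    # includes a diagnostic for whether k was large enough
>   fit2 <- gam(DV ~ s(IV, k = 20), family = binomial, data = d)
>   plot(fit)         # estimated smooth, on the logit scale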
>
>
> On Sun, Jan 11, 2015 at 5:23 PM, Marc Schwartz <marc_schwartz at me.com>
> wrote:
> >
> >> On Jan 11, 2015, at 4:00 PM, Ben Bolker <bbolker at gmail.com> wrote:
> >>
> >> Stanislav Aggerwal <stan.aggerwal <at> gmail.com> writes:
> >>
> >>>
> >>> I have the following problem.
> >>> DV is binomial p.
> >>> IV is a quantitative variable that goes from negative to positive values.
> >>>
> >>> The data look like this (need nonproportional font to view):
> >>
> >>
> >>  [snip to make gmane happy]
> >>
> >>> If these data were symmetrical about zero, I could use abs(IV) and
> >>> do glm(p ~ absIV).
> >>> I suppose I could fit two glms, one to positive and one to negative
> >>> IV values. That seems a rather ugly approach.
> >>>
> >>
> >> [snip]
> >>
> >>
> >>  What's wrong with a GLM with quadratic terms in the predictor variable?
> >>
> >> This is perfectly respectable, well-defined, and easy to implement:
> >>
> >>  glm(y~poly(x,2),family=binomial,data=...)
> >>
> >> or   y~x+I(x^2)  or y~poly(x,2,raw=TRUE)
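> >>
> >> For example, a small hypothetical sketch on simulated data (the names
> >> are illustrative only):
> >>
> >>  set.seed(1)
> >>  dd <- data.frame(x = runif(300, -2, 2))
> >>  dd$y <- rbinom(300, 1, plogis(2 - 3 * dd$x^2))   # probability peaks near x = 0
> >>  fit  <- glm(y ~ poly(x, 2), family = binomial, data = dd)
> >>  fit2 <- glm(y ~ x + I(x^2), family = binomial, data = dd)  # same fit, raw coefficients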
> >>
> >>> (To complicate things further, this is a within-subjects design.)
> >>
> >> glmer, glmmPQL, glmmML, etc. should all support this just fine.
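> >>
> >> A sketch with lme4's glmer, assuming a hypothetical 'subject' grouping
> >> column for the within-subjects structure:
> >>
> >>  library(lme4)
> >>  glmer(y ~ poly(x, 2) + (1 | subject), family = binomial, data = dd)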
> >
> >
> > As an alternative to Ben's recommendation, consider using a piecewise
> > cubic spline on the IV. This can be done using glm():
> >
> >   # splines is part of the Base R distribution
> >   # I am using 'df = 5' below, but this can be adjusted up or down
> >   # as may be apropos
> >   require(splines)
> >   glm(DV ~ ns(IV, df = 5), family = binomial, data = YourDataFrame)
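> >
> >   # A hypothetical follow-up to visualize the fitted (possibly
> >   # nonmonotonic) curve; 'fit' and the data/column names are placeholders:
> >   fit  <- glm(DV ~ ns(IV, df = 5), family = binomial, data = YourDataFrame)
> >   grid <- data.frame(IV = seq(min(YourDataFrame$IV),
> >                               max(YourDataFrame$IV), length.out = 200))
> >   grid$p <- predict(fit, newdata = grid, type = "response")
> >   plot(grid$IV, grid$p, type = "l", xlab = "IV", ylab = "Fitted P(DV)")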
> >
> >
> > and, as Ben notes, this approach is also supported more generally in
> > mixed models.
> >
> > If this were not a mixed model, another logistic regression
> > implementation is in Frank Harrell's rms package on CRAN, using his
> > lrm() instead of glm() and rcs() instead of ns():
> >
> > # after installing rms from CRAN
> > require(rms)
> > lrm(DV ~ rcs(IV, 5), data = YourDataFrame)
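> >
> > # A sketch of a possible follow-up with rms's own plotting helpers
> > # (datadist setup is needed for Predict(); the names are placeholders):
> > ddist <- datadist(YourDataFrame); options(datadist = "ddist")
> > fit <- lrm(DV ~ rcs(IV, 5), data = YourDataFrame)
> > plot(Predict(fit, IV, fun = plogis))  # fitted probability as a function of IV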
> >
> >
> > Regards,
> >
> > Marc Schwartz
> >
> >
>
