[R] statistics question about a statement in Julian Faraway's "Extending the Linear Model with R" text
Greg Snow
Greg.Snow at imail.org
Mon Jul 14 23:22:19 CEST 2008
For the binomial the standard link function is the logit:
g(y) = log( y/(1-y) )
In the binomial GLM the observed y values are 0 or 1, which give g(0) = -Inf and g(1) = Inf, so you cannot regress g(y) on X directly. Switching to g(mu), where the fitted mean mu is strictly between 0 and 1, gives finite values that are much easier for the computer to work with.
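
As a quick illustration (a minimal sketch with made-up data; the code below is mine, not something from Faraway's text), the first line shows what happens when the logit is applied to the raw 0/1 responses, and the loop is a bare-bones version of the IRLS iteration that glm() carries out internally, working with eta = g(mu) rather than g(y):

## Minimal sketch, logit link throughout; data are simulated
qlogis(c(0, 1))                  # logit of the raw 0/1 responses: -Inf  Inf

set.seed(1)
x <- rnorm(100)
y <- rbinom(100, size = 1, prob = plogis(0.5 + 1.2 * x))
X <- cbind(1, x)

beta <- c(0, 0)                  # starting values
for (i in 1:25) {
  eta <- drop(X %*% beta)        # linear predictor
  mu  <- plogis(eta)             # fitted means, strictly between 0 and 1
  w   <- mu * (1 - mu)           # IRLS weights (roughly 1/var(g(y)))
  z   <- eta + (y - mu) / w      # working response: linearized g(y)
  beta <- drop(solve(crossprod(X, w * X), crossprod(X, w * z)))
}

cbind(IRLS = beta, glm = coef(glm(y ~ x, family = binomial)))

The two columns should agree to several decimal places, and notice that the iteration never has to evaluate the link at 0 or 1.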
Hope this helps,
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
greg.snow at imail.org
(801) 408-8111
> -----Original Message-----
> From: r-help-bounces at r-project.org
> [mailto:r-help-bounces at r-project.org] On Behalf Of
> markleeds at verizon.net
> Sent: Monday, July 14, 2008 2:48 PM
> To: r-help at stat.math.ethz.ch
> Subject: [R] statistics question about a statement in Julian
> Faraway's "Extending the Linear Model with R" text
>
> In Julian Faraway's text, on pages 117-119, he gives a very
> nice, pretty simple description of how a GLM can be thought
> of as a linear model with non-constant variance. I just didn't
> understand one of his statements at the top of page 118. To quote:
>
> "We can use a similar idea to fit a GLM. Roughly speaking, we
> want to regress g(y) on X with weights inversely proportional
> to var(g(y). However, g(y) might not make sense in some cases
> - for example in the binomial GLM. So we linearize g(y) as
> follows: Let eta = g(mu) and mu = E(Y). Now do a one step
> expanation , blah, blah, blah.
>
> Could someone explain ( briefly is fine ) what he means by
> g(y) might not make sense in some cases - for example in the
> binomial GLM ?
>
> Thanks.
>