[R] difference between lrm's "Model L.R." and anova's "Chi-Square"
Frank E Harrell Jr
f.harrell at vanderbilt.edu
Sun Mar 2 05:15:26 CET 2008
johnson4 at babel.ling.upenn.edu wrote:
> I am running lrm() with a single factor. I then run anova() on the fitted
> model to obtain a p-value associated with having that factor in the model.
>
> I am noticing that the "Model L.R." in the lrm results is almost the same
> as the "Chi-Square" in the anova results, but not quite; the latter value
> is always slightly smaller.
>
> anova() calculates the p-value based on "Chi-Square", but I have
> independent evidence that "Model L.R." is the actual -2*log(LR), so should
> I be using that?
>
> Why are the values different?
anova() (anova.Design) computes Wald statistics. When the log-likelihood
is nearly quadratic, these statistics will be very close to the
likelihood-ratio chi-square statistics, which is why the two values almost
agree. In general LR chi-square tests are better; we use Wald tests for
speed. It's best to take the time and run lrtest(fit1, fit2) in Design,
where one of the two fits is nested within the other (i.e., uses a subset
of the other's predictors).
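For example, a minimal sketch of that nested comparison (the variables x1,
x2, and y below are invented purely for illustration, and the Design
package is assumed to be attached):

  library(Design)

  set.seed(1)
  n  <- 200
  x1 <- rnorm(n)
  x2 <- rnorm(n)
  y  <- rbinom(n, 1, plogis(x1))   # outcome depends on x1 only

  full    <- lrm(y ~ x1 + x2)      # larger model
  reduced <- lrm(y ~ x1)           # nested model: subset of the predictors

  anova(full)                      # Wald chi-square for each term
  lrtest(reduced, full)            # likelihood-ratio chi-square for adding x2

The Wald chi-square that anova() reports for x2 and the chi-square from
lrtest() will typically be close; when they differ, the likelihood-ratio
statistic is the one to prefer.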
Frank Harrell
>
> library(boot)    # provides inv.logit() (also available in gtools)
> library(Design)  # provides lrm() and anova.Design()
>
> prob_a <- inv.logit(rnorm(1, 0, 1))  # success probability for group "a"
> prob_b <- inv.logit(rnorm(1, 0, 1))  # success probability for group "b"
> data <- data.frame(
>   factor  = c(rep("a", 500), rep("b", 500)),
>   outcome = c(sample(c(1, 0), 500, replace = TRUE, prob = c(prob_a, 1 - prob_a)),
>               sample(c(1, 0), 500, replace = TRUE, prob = c(prob_b, 1 - prob_b))))
> fit <- lrm(outcome ~ factor, data)
>
> fit        # gives "Model L.R." e.g. 8.23, 11.76, 6.89...
> anova(fit) # gives "Chi-Square" e.g. 8.19, 11.69, 6.85...
>
--
Frank E Harrell Jr, Professor and Chair, Department of Biostatistics,
School of Medicine, Vanderbilt University