[R] Optimal Y>=q cutoff after logistic regression
Daniel Weitzenfeld
dweitzenfeld at gmail.com
Mon Feb 14 06:31:29 CET 2011
Hi,
I understand that dichotomization of the predicted probabilities after
logistic regression is philosophically questionable, throws away
information, etc.
But I want to do it anyway. I'd like to include, as a measure of fit,
the percentage of observations correctly classified, because it's
measured in units that non-statisticians can understand more easily
than the area under the ROC curve, Dxy, etc.
Am I right that there is an optimal Y>=q probability cutoff, at which
the True Positive Rate is high and the False Positive Rate is low?
Visually, it would be the elbow in the ROC curve, right?
My reasoning is that even if you had a near-perfect model, you could
set a stupidly low (or high) cutoff and end up with a higher false
positive (or false negative) rate than would be optimal.
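To make the elbow idea concrete, one common formalization is the
cutoff maximizing TPR - FPR (Youden's J). Here's a minimal base-R
sketch of what I mean, assuming a fitted glm(..., family = binomial)
object called fit (the names are just placeholders):

p <- predict(fit, type = "response")   # predicted probabilities
y <- fit$y                             # observed 0/1 outcomes
cutoffs <- seq(0.01, 0.99, by = 0.01)
youden <- sapply(cutoffs, function(q) {
  pred <- as.numeric(p >= q)
  tpr <- sum(pred == 1 & y == 1) / sum(y == 1)  # true positive rate
  fpr <- sum(pred == 1 & y == 0) / sum(y == 0)  # false positive rate
  tpr - fpr
})
cutoffs[which.max(youden)]  # cutoff nearest the "elbow"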
I know the standard default or starting point is Y>=.5, but if my
reasoning above is correct, there ought to be an optimal cutoff for a
given model. Is there an easy way to determine that cutoff in R,
without writing my own script to iterate through possible breakpoints
and calculate classification accuracy at each one? Something like the
sketch below is what I'm hoping to avoid hand-rolling:
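(Here p and y are the predicted probabilities and 0/1 outcomes from
the earlier sketch; again, the names are placeholders.)

cutoffs <- seq(0.01, 0.99, by = 0.01)
# % correctly classified at each candidate cutoff
acc <- sapply(cutoffs, function(q) mean((p >= q) == y))
cutoffs[which.max(acc)]  # cutoff maximizing classification accuracy
max(acc)                 # the accuracy achieved at that cutoff

I'd guess a package like ROCR or pROC has this built in somewhere, but
I haven't found the right function.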
Thanks in advance.
-Dan