Abstract
We consider regression situations for which the response variable is dichotomous. The most common analysis fits successively richer linear logistic models and measures the residual variation from the model by minus twice the maximized log likelihood. General measures of residual variation are considered here, including ordinary squared error and prediction error as well as the log likelihood. All of these are shown to be satisfactory in a certain primitive sense, unlike quantitative regression theory where only squared error is logically satisfactory. The relation of Goodman and Kruskal's measures of categorical association to the theory of penalty functions and probability elicitation is demonstrated.
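The following is a minimal illustrative sketch, not code from the paper: it fits a linear logistic model to simulated zero-one data and computes the three measures of residual variation the abstract mentions, namely the deviance (minus twice the maximized log likelihood), ordinary squared error, and prediction (misclassification) error. The simulated data, variable names, and IRLS fitting loop are assumptions made here for illustration.

```python
# Hedged sketch (not the paper's code): compare three measures of residual
# variation for a dichotomous response under a fitted linear logistic model.
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: y is 0/1 with success probability given by a logistic model.
n = 200
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])          # design matrix with intercept
true_beta = np.array([-0.5, 1.5])
p_true = 1.0 / (1.0 + np.exp(-X @ true_beta))
y = rng.binomial(1, p_true)

# Fit the logistic model by IRLS (Newton-Raphson on the log likelihood).
beta = np.zeros(X.shape[1])
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-(X @ beta)))
    W = p * (1 - p)
    # Newton step: beta <- beta + (X' W X)^{-1} X'(y - p)
    step = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))
    beta += step
    if np.max(np.abs(step)) < 1e-10:
        break

p_hat = 1.0 / (1.0 + np.exp(-(X @ beta)))

# Three measures of residual variation from the fitted model:
eps = 1e-12
deviance = -2.0 * np.sum(y * np.log(p_hat + eps) + (1 - y) * np.log(1 - p_hat + eps))
squared_error = np.sum((y - p_hat) ** 2)
prediction_error = np.mean((p_hat >= 0.5) != y)   # misclassification rate

print(f"deviance (-2 log L): {deviance:.2f}")
print(f"squared error:       {squared_error:.2f}")
print(f"prediction error:    {prediction_error:.3f}")
```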
