Abstract
In this paper it is shown that an ill-conditioned data matrix affects the parameter estimator in generalized linear models in much the same way as in linear regression models. Asymptotically, the average length of the maximum likelihood estimator of the parameter vector increases as the conditioning of the covariance matrix deteriorates. A generalization of ridge regression is suggested for maximum likelihood estimation in generalized linear models. In particular, it is shown that there exists a ridge coefficient, k, such that the asymptotic mean square error of the generalized linear model ridge estimator is smaller than the asymptotic variance of the maximum likelihood estimator. A numerical example illustrates the theoretical results.
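To make the construction concrete, the sketch below fits a logistic GLM by iteratively reweighted least squares (IRLS) and then applies a ridge-type adjustment of the form β(k) = (X'ŴX + kI)⁻¹ X'ŴX β̂_ML, the common GLM analogue of ordinary ridge regression in which the estimated Fisher information X'ŴX replaces X'X. This is only a minimal illustration under that assumed form; the estimator analysed in the paper may differ in detail, and the function names and simulated data are purely illustrative.

```python
import numpy as np

def irls_logistic(X, y, n_iter=50, tol=1e-8):
    """Maximum likelihood fit of a logistic GLM via IRLS.
    Returns the ML estimate and the final working weights."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))            # mean under the logit link
        w = mu * (1.0 - mu)                        # GLM working weights
        z = eta + (y - mu) / np.maximum(w, 1e-12)  # working response
        XtW = X.T * w                              # X' diag(w)
        beta_new = np.linalg.solve(XtW @ X, XtW @ z)
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta, w

def glm_ridge(X, beta_ml, w, k):
    """Ridge-adjusted estimator beta(k) = (X'WX + kI)^{-1} X'WX beta_ml,
    i.e. ordinary ridge shrinkage applied with the estimated Fisher
    information X'WX in place of X'X (assumed form, for illustration)."""
    XtWX = (X.T * w) @ X
    p = X.shape[1]
    return np.linalg.solve(XtWX + k * np.eye(p), XtWX @ beta_ml)

# Illustration with a nearly collinear design: when X'WX is ill-conditioned,
# the ML estimate tends to be long, and the ridge estimate shrinks it.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)                # near-collinear regressor
X = np.column_stack([np.ones(n), x1, x2])
eta_true = 0.5 + x1 - 0.5 * x2
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta_true)))

beta_ml, w = irls_logistic(X, y)
beta_k = glm_ridge(X, beta_ml, w, k=1.0)
print("||beta_ML||  =", np.linalg.norm(beta_ml))
print("||beta(k)||  =", np.linalg.norm(beta_k))
```

On the hypothetical data above, the ridge-adjusted estimate has a noticeably smaller norm than the ML estimate, mirroring the asymptotic length inflation described in the abstract; the choice k = 1.0 is arbitrary here, whereas the paper establishes the existence of a k that reduces the asymptotic mean square error.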