Optimal Bias in Ridge Regression Approaches To Multicollinearity

Abstract
Ridge regression, based on adding a small quantity, k, to the diagonal of the correlation matrix of highly collinear independent variables, can reduce the error variance of estimators, but at the expense of introducing bias. Because bias is a monotonically increasing function of k, the problem of choosing an appropriate value of k as the ridge increment has yet to be resolved. This paper proposes a method for selecting the optimal value of k in the sense of minimizing the mean square error of estimation. First, we demonstrate mathematically the existence of a minimum mean square error point of the ridge estimator along the scale of k. Second, we present an iterative procedure for locating the value of k which minimizes the mean square error of the estimates for any correlated data set.
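The abstract's central claim, that the ridge estimator's mean square error has an interior minimum in k for collinear data, can be illustrated with a small Monte Carlo sketch. The paper's own iterative procedure is not reproduced here; the simulation below, with entirely synthetic data and an arbitrary grid search in place of that procedure, simply shows MSE(k) dipping below the least-squares (k = 0) value before rising again as bias dominates.

```python
import numpy as np

# Hedged illustration: synthetic, highly collinear predictors; a grid
# search over k stands in for the paper's iterative procedure.
rng = np.random.default_rng(0)

n, p = 100, 3
z = rng.normal(size=n)
# Columns are near-copies of z, producing severe multicollinearity.
X = np.column_stack([z + 0.01 * rng.normal(size=n) for _ in range(p)])
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize (correlation form)
beta_true = np.array([1.0, 2.0, -1.0])      # assumed "true" coefficients

def ridge_mse(k, n_rep=200):
    """Monte Carlo estimate of E||beta_hat(k) - beta_true||^2,
    where beta_hat(k) = (X'X + kI)^{-1} X'y."""
    A = np.linalg.inv(X.T @ X + k * np.eye(p)) @ X.T
    errs = []
    for _ in range(n_rep):
        y = X @ beta_true + rng.normal(size=n)
        errs.append(np.sum((A @ y - beta_true) ** 2))
    return float(np.mean(errs))

ks = np.linspace(0.0, 2.0, 21)
mses = [ridge_mse(k) for k in ks]
k_best = ks[int(np.argmin(mses))]
print(f"estimated MSE at k=0: {mses[0]:.2f}")
print(f"k minimizing estimated MSE: {k_best:.2f}")
```

Under this setup the minimizing k is strictly positive: a small amount of bias buys a large reduction in variance, exactly the trade-off the abstract describes.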
