Abstract
In ordinary linear regression the ridge estimator of the parameter vector is obtained by adding a small constant k to the diagonal elements of the matrix X'X before inversion. It is well known that there exists a k such that the ridge estimator has a smaller mean square error (MSE) than the ordinary least squares estimator. However, the MSE of the ridge estimator depends not only on k and the eigenvalues of X'X but also on the unknown parameters of the model, so the optimal k (which minimizes the MSE) cannot be calculated in practice. In this paper we suggest a procedure in which an “empirical” version of the MSE function is minimized. The efficiency of our procedure relative to least squares and to two other ridge procedures suggested in the literature is studied on simulated data. The results of the simulations indicate that our procedure is quite efficient, at least if the number of observations is not too small.
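The ridge estimator described above can be sketched in a few lines; this is a minimal illustration of the general formula β̂(k) = (X'X + kI)⁻¹X'y, not the paper's selection procedure for k. The data and the trial value k = 1 are invented for illustration only.

```python
import numpy as np

def ridge_estimator(X, y, k):
    """Return (X'X + k*I)^{-1} X'y; k = 0 gives ordinary least squares."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

# Simulated data for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
beta = np.array([1.0, -2.0, 0.5])
y = X @ beta + rng.normal(scale=0.1, size=50)

b_ols = ridge_estimator(X, y, 0.0)    # ordinary least squares
b_ridge = ridge_estimator(X, y, 1.0)  # ridge with trial value k = 1
```

For any k > 0 the ridge estimate is shrunk toward zero relative to least squares, which is the source of the bias–variance trade-off that makes some k beat least squares in MSE.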
