A procedure for determination of a good ridge parameter in linear regression
- 1 January 1982
- research article
- Published by Taylor & Francis in Communications in Statistics - Simulation and Computation
- Vol. 11 (3), 285-309
- https://doi.org/10.1080/03610918208812264
Abstract
In ordinary linear regression the ridge estimator of the parameter vector is obtained by adding a small constant k to the diagonal elements of the matrix X'X before inversion. It is well known that there exists a k such that the ridge estimator has a smaller mean square error (MSE) than the ordinary least squares estimator. However, the MSE of the ridge estimator depends not only on k and the eigenvalues of X'X but also on the unknown parameters of the model, so the optimal k (which minimizes the MSE) cannot be calculated in practice. In this paper we suggest a procedure in which an “empirical” version of the MSE function is minimized. The efficiency of our procedure relative to least squares and to two other ridge procedures suggested in the literature is studied on simulated data. The results of the simulations indicate that our procedure is quite efficient, at least if the number of observations is not too small.