Ridge Regression in Practice
- 1 February 1975
- Research article
- Published by Taylor & Francis in The American Statistician
- Vol. 29 (1), 3-20
- https://doi.org/10.1080/00031305.1975.10479105
Abstract
The use of biased estimation in data analysis and model building is discussed. A review of the theory of ridge regression and its relation to generalized inverse regression is presented, along with the results of a simulation experiment and three examples of the use of ridge regression in practice. Comments on variable selection procedures, model validation, and ridge and generalized inverse regression computation procedures are included. The examples studied here show that when the predictor variables are highly correlated, ridge regression produces coefficients which predict and extrapolate better than least squares and is a safe procedure for selecting variables.
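For readers unfamiliar with the estimator the abstract refers to, the sketch below contrasts ordinary least squares with the ridge estimator (X'X + kI)^{-1} X'y on synthetic, highly correlated predictors. The data, the value of k, and all variable names are illustrative assumptions, not material from the paper itself.

```python
# Minimal sketch (not from the paper): ridge vs. least-squares coefficients
# on synthetic data with nearly collinear predictors. Data, k, and names
# are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)          # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 2.0 * x1 + 1.0 * x2 + rng.normal(scale=0.5, size=n)

# Standardize predictors and center the response, as is conventional
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
yc = y - y.mean()

def ridge_coefficients(X, y, k):
    """Ridge estimator (X'X + kI)^{-1} X'y; k = 0 reduces to least squares."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

beta_ols = ridge_coefficients(Xs, yc, k=0.0)    # unstable under collinearity
beta_ridge = ridge_coefficients(Xs, yc, k=0.1)  # small bias, reduced variance

print("least squares:", beta_ols)
print("ridge (k=0.1):", beta_ridge)
```

With predictors this strongly correlated, the least-squares coefficients can swing wildly between the two variables, while the ridge coefficients shrink toward a more stable split, which is the behavior the abstract describes.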
This publication has 14 references indexed in Scilit:
- The Relationship Between Variable Selection and Data Agumentation and a Method for Prediction. Technometrics, 1974
- On Biased Estimation in Linear Models. Technometrics, 1973
- Combining Experiments to Predict Future Yield Data. Agronomy Journal, 1972
- Generalized Inverses, Ridge Regression, Biased Linear Estimation, and Nonlinear Estimation. Technometrics, 1970
- Ridge Regression: Applications to Nonorthogonal Problems. Technometrics, 1970
- Ridge Regression: Biased Estimation for Nonorthogonal Problems. Technometrics, 1970
- Combined Analysis of Yield Data from Fertilizer Experiments. Agronomy Journal, 1969
- Computer Aided Design of Experiments. Technometrics, 1969
- Selection of Variables for Fitting Equations to Data. Technometrics, 1966
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters. Journal of the Society for Industrial and Applied Mathematics, 1963