Robust Regression Computation Using Iteratively Reweighted Least Squares
- 1 July 1990
- journal article
- Published by Society for Industrial & Applied Mathematics (SIAM) in SIAM Journal on Matrix Analysis and Applications
- Vol. 11 (3), 466–480
- https://doi.org/10.1137/0611032
Abstract
Several variants of Newton’s method are used to obtain estimates of solution vectors and residual vectors for the linear model $Ax = b + e = b_{\mathrm{true}}$ using an iteratively reweighted least squares criterion, which tends to diminish the influence of outliers compared with the standard least squares criterion. Algorithms appropriate for dense and sparse matrices are presented. Solving Newton’s linear system using updated matrix factorizations or the (unpreconditioned) conjugate gradient iteration gives the most effective algorithms. Four weighting functions are compared, and results are given for sparse well-conditioned and ill-conditioned problems.
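The iteratively reweighted least squares idea can be illustrated with a minimal sketch. This is not the paper's Newton-based algorithm or its factorization updates; it is the basic fixed-point IRLS iteration, assuming the Huber weight function (one of the classical choices for robust regression) with a hypothetical threshold `delta`. Each pass solves a weighted least squares subproblem in which large residuals receive weights below one, diminishing the influence of outliers.

```python
import numpy as np

def irls_huber(A, b, delta=1.345, max_iter=50, tol=1e-8):
    """Basic IRLS for robust linear regression with Huber weights.

    Approximately minimizes sum_i rho(r_i) for r = A x - b, where rho
    is the Huber function with threshold `delta` (an assumed default).
    Each iteration solves a weighted least squares subproblem.
    """
    # Start from the ordinary least squares solution.
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    for _ in range(max_iter):
        r = A @ x - b
        # Huber weights: 1 for small residuals, delta/|r| for large
        # ones, so outliers contribute less to the next solve.
        absr = np.maximum(np.abs(r), 1e-12)  # guard against division by zero
        w = np.where(absr <= delta, 1.0, delta / absr)
        # Weighted least squares via row scaling by sqrt(w).
        sw = np.sqrt(w)
        x_new, *_ = np.linalg.lstsq(sw[:, None] * A, sw * b, rcond=None)
        if np.linalg.norm(x_new - x) <= tol * (1 + np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

# Usage: a line fit with one gross outlier. The IRLS estimate stays
# close to the true coefficients, while ordinary least squares is
# pulled noticeably toward the outlier.
A = np.column_stack([np.ones(20), np.arange(20.0)])
x_true = np.array([1.0, 2.0])
b = A @ x_true
b[5] += 50.0  # one corrupted observation
x_irls = irls_huber(A, b)
x_ols, *_ = np.linalg.lstsq(A, b, rcond=None)
```

The paper studies faster Newton-type schemes for the same objective; the fixed-point iteration above is only the conceptual baseline those methods improve on.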