Asymptotics for Least Absolute Deviation Regression Estimators
- 1 June 1991
- Research article
- Published by Cambridge University Press (CUP) in Econometric Theory
- Vol. 7 (2), 186–199
- https://doi.org/10.1017/s0266466600004394
Abstract
The LAD estimator of the vector parameter in a linear regression is defined by minimizing the sum of the absolute values of the residuals. This paper provides a direct proof of asymptotic normality for the LAD estimator. The main theorem assumes deterministic carriers. The extension to random carriers includes the case of autoregressions whose error terms have finite second moments. For a first-order autoregression with Cauchy errors, the LAD estimator is shown to converge at a 1/n rate.
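The defining criterion in the abstract — minimizing the sum of absolute residuals — can be computed exactly as a linear program. The sketch below is illustrative only and is not from the paper; the function name `lad_fit`, the LP formulation, and the simulated Cauchy-error example are all assumptions for demonstration.

```python
import numpy as np
from scipy.optimize import linprog

def lad_fit(X, y):
    """Least absolute deviations fit via linear programming (hypothetical helper).

    Split each residual into nonnegative parts u+, u- >= 0 and solve
        min 1'u+ + 1'u-   subject to   X b + u+ - u- = y,
    so that u+ + u- = |y - X b| at the optimum.
    """
    n, p = X.shape
    # Decision vector: [b (free), u+ (>= 0), u- (>= 0)]
    c = np.concatenate([np.zeros(p), np.ones(2 * n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

# Simulated illustration: heavy-tailed Cauchy errors, where OLS breaks down
# (infinite variance) but the LAD estimator remains well behaved.
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.standard_cauchy(size=n)
b_hat = lad_fit(X, y)  # should lie near the true coefficients (1.0, 2.0)
```

The LP reformulation is standard for L1 objectives: minimizing `u+ + u-` under the equality constraint reproduces `|residual|` because at most one of the two slack parts is nonzero at the optimum.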