A comparison of some robust, adaptive, and partially adaptive estimators of regression models
- 1 January 1993
- journal article
- research article
- Published by Taylor & Francis in Econometric Reviews
- Vol. 12 (1) , 103-124
- https://doi.org/10.1080/07474939308800255
Abstract
Numerous estimation techniques for regression models have been proposed. These procedures differ in how sample information is used in the estimation procedure. The efficiency of ordinary least squares (OLS) estimators implicitly assumes normally distributed residuals and is very sensitive to departures from normality, particularly to "outliers" and thick-tailed distributions. Least absolute deviation (LAD) estimators are less sensitive to outliers and are optimal for Laplace random disturbances, but not for normal errors. This paper reports Monte Carlo comparisons of OLS, LAD, two robust estimators discussed by Huber, three partially adaptive estimators, Newey's generalized method of moments estimator, and an adaptive maximum likelihood estimator based on a normal kernel studied by Manski. This paper is the first to compare the relative performance of some adaptive robust estimators (partially adaptive and adaptive procedures) with some common nonadaptive robust estimators. The partially adaptive estimators are based on three flexible parametric distributions for the errors: the power exponential (Box-Tiao) and generalized t distributions, as well as a distribution for the errors which is not necessarily symmetric. The adaptive procedures are "fully iterative" rather than one-step estimators. The adaptive estimators have desirable large-sample properties, but these properties do not necessarily carry over to the small-sample case. The Monte Carlo comparisons of the alternative estimators are based on four different specifications for the error distribution: a normal, a mixture of normals (or variance-contaminated normal), a bimodal mixture of normals, and a lognormal. Five hundred samples of size 50 are used.
The adaptive and partially adaptive estimators perform very well relative to the other estimation procedures considered, and preliminary results suggest that in some important cases they can perform much better than OLS, with 50 to 80% reductions in standard errors.
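The kind of Monte Carlo comparison described in the abstract can be sketched in a few lines. The snippet below is an illustrative simulation, not the paper's exact design: it compares OLS and LAD slope estimates under a variance-contaminated normal error distribution, using the abstract's 500 samples of size 50. The regression coefficients, regressor distribution, and contamination parameters (90% N(0,1), 10% N(0,25)) are assumed for illustration, and the LAD fit is approximated by iteratively reweighted least squares rather than an exact linear-programming solution.

```python
import numpy as np

def ols(X, y):
    # Ordinary least squares via numpy's least-squares solver.
    return np.linalg.lstsq(X, y, rcond=None)[0]

def lad(X, y, iters=50, eps=1e-6):
    # Approximate least absolute deviation fit via iteratively
    # reweighted least squares with weights 1 / |residual|.
    beta = ols(X, y)
    for _ in range(iters):
        w = 1.0 / np.maximum(np.abs(y - X @ beta), eps)
        beta = np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (w * y))
    return beta

rng = np.random.default_rng(0)
n, reps = 50, 500  # sample size and replication count from the abstract
slopes_ols, slopes_lad = [], []
for _ in range(reps):
    x = rng.uniform(0.0, 10.0, n)
    X = np.column_stack([np.ones(n), x])
    # Variance-contaminated normal errors: 90% N(0,1), 10% N(0,25).
    sd = np.where(rng.random(n) < 0.1, 5.0, 1.0)
    y = 1.0 + 2.0 * x + rng.normal(0.0, sd)  # assumed true slope = 2
    slopes_ols.append(ols(X, y)[1])
    slopes_lad.append(lad(X, y)[1])

print("std of OLS slope estimates:", np.std(slopes_ols))
print("std of LAD slope estimates:", np.std(slopes_lad))
```

Under this thick-tailed error specification, the spread of the LAD slope estimates across replications comes out noticeably smaller than that of OLS, which is the qualitative pattern the abstract reports for the robust and adaptive procedures.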
This publication has 18 references indexed in Scilit:
- Monte Carlo Evidence on Adaptive Maximum Likelihood Estimation of a Regression. The Annals of Statistics, 1987
- Adaptive estimation of non-linear regression models. Econometric Reviews, 1984
- On Adaptive Estimation. The Annals of Statistics, 1982
- Robust methods in econometrics. Econometric Reviews, 1982
- Robust Statistics. Published by Wiley, 1981
- Asymptotic Theory of Least Absolute Error Regression. Journal of the American Statistical Association, 1978
- Least absolute values estimation: an introduction. Communications in Statistics - Simulation and Computation, 1977
- Adaptive Robust Procedures: A Partial Review and Some Suggestions for Future Applications and Theory. Journal of the American Statistical Association, 1974
- A Comparison of the Stable and Student Distributions as Statistical Models for Stock Prices. The Journal of Business, 1974
- Robust Estimation of a Location Parameter. The Annals of Mathematical Statistics, 1964