Estimation of the Covariance Matrix of the Least-Squares Regression Coefficients When the Disturbance Covariance Matrix Is of Unknown Form
- 1 March 1991
- journal article
- research article
- Published by Cambridge University Press (CUP) in Econometric Theory
- Vol. 7 (1), 22-45
- https://doi.org/10.1017/s0266466600004229
Abstract
This paper deals with the problem of estimating the covariance matrix of the least-squares regression coefficients under heteroskedasticity and/or autocorrelation of unknown form. We consider an estimator proposed by White [17] and give a relatively simple proof of its consistency. Our proof is based on more easily verifiable conditions than those of White. An alternative estimator with improved small-sample properties is also presented.
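As a concrete illustration of the general idea, the sketch below computes a White-type "sandwich" covariance estimate for the OLS coefficients in the pure heteroskedasticity case (the HC0 form from the 1980 Econometrica reference below). It is a minimal sketch only, not the estimator analyzed in this paper, which also accommodates autocorrelation of unknown form; the function name, variable names, and NumPy implementation are assumptions made for this example.

```python
import numpy as np

def ols_with_hc0(X, y):
    """OLS coefficients and a White-type (HC0) sandwich covariance estimate.

    Illustrative sketch for the pure heteroskedasticity case; it does not
    model the disturbance covariance matrix, only the squared residuals.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)

    XtX_inv = np.linalg.inv(X.T @ X)       # (X'X)^{-1}, the "bread"
    beta_hat = XtX_inv @ X.T @ y           # least-squares coefficients
    resid = y - X @ beta_hat               # OLS residuals

    # "Meat": sum over observations of u_i^2 * x_i x_i'
    meat = (X * resid[:, None] ** 2).T @ X

    cov_hc0 = XtX_inv @ meat @ XtX_inv     # sandwich covariance estimate
    return beta_hat, cov_hc0


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 200
    x = rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])
    # Heteroskedastic disturbances: standard deviation grows with |x|.
    y = 1.0 + 2.0 * x + rng.normal(scale=0.5 + np.abs(x), size=n)
    beta, cov = ols_with_hc0(X, y)
    print("coefficients:", beta)
    print("HC0 standard errors:", np.sqrt(np.diag(cov)))
```

Jackknife-type rescalings of the squared residuals (compare the 1985 Journal of Econometrics and 1977 Technometrics entries in the reference list below) are the usual route to the improved small-sample behavior the abstract alludes to.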
References
- A Simple, Positive Semi-Definite, Heteroskedasticity and Autocorrelation Consistent Covariance Matrix. Econometrica, 1987.
- Some heteroskedasticity-consistent covariance matrix estimators with improved finite sample properties. Journal of Econometrics, 1985.
- Nonlinear Regression with Dependent Observations. Econometrica, 1984.
- More Efficient Estimation in the Presence of Heteroscedasticity of Unknown Form. Econometrica, 1983.
- Conditions for linear processes to be strong-mixing. Probability Theory and Related Fields, 1981.
- A Heteroskedasticity-Consistent Covariance Matrix Estimator and a Direct Test for Heteroskedasticity. Econometrica, 1980.
- Jackknifing in Unbalanced Situations. Technometrics, 1977.
- Bounds on the Variance of Regression Coefficients Due to Heteroscedastic or Autoregressive Errors. Econometrica, 1974.
- Asymptotic Normality and Consistency of the Least Squares Estimators for Families of Linear Regressions. The Annals of Mathematical Statistics, 1963.