Regression and autoregression with infinite variance
- 1 December 1974
- Journal article
- Published by Cambridge University Press (CUP) in Advances in Applied Probability
- Vol. 6 (4), 768-783
- https://doi.org/10.2307/1426192
Abstract
The theory of the linear model is incomplete in that it fails to deal with variables possessing infinite variance. To fill an important part of this gap, we give an unbiased estimate, the “screened ratio estimate”, for λ in the regression E(X|Z) = λZ; X and Z are linear combinations of independent, identically distributed symmetric random variables that are either stable or asymptotically Pareto distributed of index α ≤ 2. By way of comparison, the usual least squares estimate of λ is shown not to converge in general to any constant when α < 2. However, in the autoregression Xn = a1Xn-1 + … + akXn-k + Un, the least squares estimates are shown to be consistent as long as the roots of 1 - a1x - a2x² - … - akxᵏ = 0 are outside the complex unit circle, Xn is independent of Un+j, j ≥ 1, and the Un are independent and identically distributed and in the domain of attraction of a stable law of index α ≤ 2. Finally, the consistency of least squares estimates for finite moving averages is established.
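The consistency claim for least squares in the heavy-tailed autoregression can be illustrated numerically. The sketch below is not from the paper; the AR(2) coefficients, tail index, and sample size are assumptions chosen for illustration. It simulates an autoregression driven by symmetric innovations with Pareto tails of index α = 1.5 (hence infinite variance) and fits the coefficients by ordinary least squares; with the chosen coefficients the roots of 1 - 0.5x + 0.3x² lie outside the unit circle, so the paper's hypotheses are met and the estimates should land close to the true values.

```python
# Illustrative sketch (not the paper's method): least squares for an AR(2)
# process with infinite-variance innovations. All parameter choices below
# are assumptions for demonstration purposes.
import numpy as np

rng = np.random.default_rng(0)

alpha = 1.5                  # tail index < 2, so the innovations have infinite variance
a = np.array([0.5, -0.3])    # true AR coefficients; roots of 1 - 0.5x + 0.3x^2
                             # lie outside the complex unit circle
n = 100_000

# Symmetric innovations whose tails are asymptotically Pareto of index alpha.
signs = rng.choice([-1.0, 1.0], size=n)
u = signs * rng.pareto(alpha, size=n)

# Generate X_n = a1*X_{n-1} + a2*X_{n-2} + U_n.
x = np.zeros(n)
for t in range(2, n):
    x[t] = a[0] * x[t - 1] + a[1] * x[t - 2] + u[t]

# Ordinary least squares regression of X_n on (X_{n-1}, X_{n-2}).
Y = x[2:]
Z = np.column_stack([x[1:-1], x[:-2]])
a_hat, *_ = np.linalg.lstsq(Z, Y, rcond=None)

print("true coefficients:", a)
print("OLS estimates:   ", a_hat)
```

Rerunning with larger n should move the estimates closer to (0.5, -0.3), in line with the consistency result, even though the sample variance of the innovations diverges.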