A recursive approach to parameter estimation in regression and time series models
- 1 January 1979
- research article
- Published by Taylor & Francis in Communications in Statistics - Theory and Methods
- Vol. 8 (12), 1227-1245
- https://doi.org/10.1080/03610927908827825
Abstract
In this paper we discuss recursive (or on-line) estimation in (i) regression and (ii) autoregressive integrated moving average (ARIMA) time series models. The adopted approach uses Kalman filtering techniques to calculate estimates recursively, and it applies to the estimation of constant as well as time-varying parameters. In the first section of the paper we consider the linear regression model and discuss recursive estimation for both constant and time-varying parameters. For constant parameters, Kalman filtering specializes to recursive least squares. In general, we allow the parameters to vary according to an autoregressive integrated moving average process and update the parameter estimates recursively. Since the stochastic model for the parameter changes will rarely be known, simplifying assumptions have to be made. In particular, we assume a random walk model for the time-varying parameters and show how to determine whether the parameters are changing over time. This is illustrated with an example.
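The abstract ties recursive least squares to Kalman filtering and models time-varying coefficients as a random walk. The sketch below is a minimal Python illustration of that connection, not the paper's own implementation: the function name, the diffuse prior `p0`, and the assumed known noise variance `sigma2` are illustrative choices. Setting the random-walk variance `q` to zero reduces the recursion to ordinary recursive least squares; `q > 0` lets the estimates track drifting parameters.

```python
import numpy as np

def kalman_recursive_estimates(X, y, q=0.0, sigma2=1.0, p0=1e6):
    """Recursively estimate regression coefficients with a Kalman filter.

    State equation:   beta_t = beta_{t-1} + w_t,  w_t ~ N(0, q * I)
    Observation:      y_t    = x_t' beta_t + e_t, e_t ~ N(0, sigma2)

    q = 0 is the constant-parameter case, where the recursion reduces
    to ordinary recursive least squares.
    """
    n, k = X.shape
    beta = np.zeros(k)          # current parameter estimate
    P = p0 * np.eye(k)          # estimate covariance (diffuse prior)
    Q = q * np.eye(k)           # random-walk variance of the parameters
    history = np.empty((n, k))

    for t in range(n):
        x = X[t]
        # Prediction step: the random-walk transition only inflates the covariance
        P = P + Q
        # Update step: standard Kalman gain for a scalar observation
        S = x @ P @ x + sigma2              # innovation variance
        K = P @ x / S                       # Kalman gain
        beta = beta + K * (y[t] - x @ beta)
        P = P - np.outer(K, x @ P)
        history[t] = beta

    return beta, history


# Usage: with q = 0 the recursive estimates converge to the least-squares fit.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = X @ np.array([2.0, -1.5]) + rng.normal(scale=0.5, size=200)
beta_hat, path = kalman_recursive_estimates(X, y, q=0.0, sigma2=0.25)
print(beta_hat)   # close to [2.0, -1.5]
```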