An adaptive d-step ahead predictor based on least squares

Abstract
This paper examines the asymptotic properties of a least squares algorithm for adaptively calculating a d-step ahead prediction of a time series. It is shown that, with probability one, the sample mean square difference between the recursive prediction and the optimal linear prediction converges to zero. Relatively weak assumptions are required regarding the underlying model of the time series.
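As a rough illustration of the kind of algorithm analysed in the paper, the following Python sketch applies standard recursive least squares to the d-step ahead prediction problem. The linear-in-parameters model y[t] ≈ θᵀφ[t−d], the regressor φ[t] built from the n most recent observations, and all names and parameter choices (adaptive_d_step_predictor, n, lam, the initial value of P) are assumptions made for illustration only, not the paper's exact algorithm or conditions.

    import numpy as np

    def adaptive_d_step_predictor(y, d, n, lam=1.0):
        # Recursive least squares d-step ahead predictor (illustrative sketch).
        # Assumed model: y[t] ~ theta' phi[t-d], with regressor
        # phi[t] = (y[t], y[t-1], ..., y[t-n+1]); n and lam are illustrative.
        N = len(y)
        theta = np.zeros(n)             # parameter estimate
        P = 1e3 * np.eye(n)             # "covariance" matrix, large initial value
        preds = np.full(N, np.nan)      # preds[t]: prediction of y[t] formed at t-d

        for t in range(n - 1 + d, N):
            # Update the estimate with the regressor formed d steps ago,
            # now that the corresponding observation y[t] is available.
            phi_old = y[t - d - n + 1:t - d + 1][::-1]
            e = y[t] - theta @ phi_old                  # d-step ahead prediction error
            K = P @ phi_old / (lam + phi_old @ P @ phi_old)
            theta = theta + K * e
            P = (P - np.outer(K, phi_old @ P)) / lam

            # Form the next d-step ahead prediction from the current regressor.
            if t + d < N:
                phi = y[t - n + 1:t + 1][::-1]
                preds[t + d] = theta @ phi

        return preds, theta

    # Example on a synthetic AR(2) series (purely illustrative data).
    rng = np.random.default_rng(0)
    y = np.zeros(2000)
    for t in range(2, 2000):
        y[t] = 1.5 * y[t - 1] - 0.7 * y[t - 2] + rng.standard_normal()
    preds, theta = adaptive_d_step_predictor(y, d=3, n=6)
    mask = ~np.isnan(preds)
    print("sample mean square prediction error:",
          np.mean((y[mask] - preds[mask]) ** 2))

In this sketch the update at time t uses only data available at time t, so the predictor is causal; the convergence result stated above concerns the sample mean square difference between such a recursive prediction and the optimal linear prediction, not the raw prediction error printed here.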