Abstract
As yet, no efficiently computable one-step nonlinear prediction algorithm has been proposed for any general class of stationary processes that performs strictly better than the optimal linear predictor. In this paper it is shown that, for the class of stationary moving average processes, the improvement obtained by optimal nonlinear prediction over optimal linear prediction is bounded by a constant which depends only on the distribution of the independent and identically distributed random variables $Y_j$ used to form the moving average process $X_n = \sum a_j Y_{n - j}$.
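To make the setting concrete, the following is a minimal sketch of how a stationary moving average process $X_n = \sum a_j Y_{n-j}$ can be simulated. The specific choices below (exponential innovations, geometrically decaying coefficients $a_j = 0.5^j$, truncation at $J$ terms) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions (not from the paper): coefficients a_j = 0.5**j
# truncated at J terms, and iid exponential innovations Y_j.
J = 20
a = 0.5 ** np.arange(J)          # coefficients a_0, ..., a_{J-1}
Y = rng.exponential(size=1000)   # iid random variables Y_j

# X_n = sum_j a_j * Y_{n-j}: a (truncated) causal moving average,
# computed as a convolution of the innovations with the coefficients.
X = np.convolve(Y, a, mode="valid")  # one value of X per fully covered window
```

Because the innovations here are non-Gaussian, the conditional expectation of $X_{n+1}$ given the past is in general nonlinear, which is exactly the regime in which nonlinear prediction could in principle beat the optimal linear predictor.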
