Abstract
The fast recursive least-squares algorithms are known to exhibit unstable behaviour and sudden divergence caused by round-off noise in finite-precision implementations. This key problem arises when a forgetting factor is introduced to make the algorithms adaptive. A similar type of divergence is presented and explained for the slow version of the algorithms. It is shown that the early divergence stems from the loss of symmetry of the inverse of the covariance matrix. The backward estimation turns out to be very sensitive, whereas the forward estimation causes no difficulty. It is also shown that the fast algorithms tend to produce unstable estimated models as time goes on. Based on this observation, a new stabilization method is presented. The method is efficient, does not change the complexity of the algorithm, and preserves adaptivity.
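For context, the divergence mechanism described above can be seen in the standard exponentially weighted (slow, O(N²)) RLS recursion, where the inverse covariance matrix P is propagated recursively and slowly loses symmetry in finite precision. The following is a minimal illustrative sketch, not the paper's algorithm or its proposed stabilization method; the function name, parameters, and the explicit re-symmetrization of P (a generic safeguard consistent with the symmetry-loss diagnosis) are all assumptions introduced here for illustration.

```python
# Illustrative sketch of exponentially weighted RLS; the re-symmetrization
# of P is a generic safeguard, not the stabilization method of the paper.
import numpy as np

def rls_identify(x, d, order, lam=0.99, delta=1e2, symmetrize=True):
    """Identify an FIR model w (length `order`) from input x and desired d.

    lam   : forgetting factor (0 < lam <= 1)
    delta : initial scaling of the inverse covariance matrix P
    """
    w = np.zeros(order)
    P = delta * np.eye(order)            # inverse of the weighted covariance
    for n in range(order, len(x)):
        u = x[n - order + 1:n + 1][::-1]  # regressor, most recent sample first
        Pu = P @ u
        k = Pu / (lam + u @ Pu)           # gain vector
        e = d[n] - w @ u                  # a priori error
        w = w + k * e
        P = (P - np.outer(k, Pu)) / lam   # Riccati-type update of P
        if symmetrize:
            # In finite precision P gradually loses symmetry; forcing
            # P = (P + P^T)/2 at each step is one common safeguard.
            P = 0.5 * (P + P.T)
    return w

# Usage: recover a known 4-tap filter from noisy observations.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
h = np.array([0.5, -0.3, 0.2, 0.1])
d = np.convolve(x, h)[:len(x)] + 1e-3 * rng.standard_normal(len(x))
print(rls_identify(x, d, order=4))
```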
