Abstract
The effects of input signal correlation on the performance of finite-precision RLS algorithms are presented. It is shown that finite-precision effects can be analyzed indirectly through the sensitivity of the RLS algorithm to perturbations in the filter coefficients. We show that the mean deviation from the optimum error power grows linearly with time and is the same for both correlated and uncorrelated input samples. However, the variance of the deviation from the optimum increases with signal correlation. Upper and lower bounds on this variance are derived in terms of the ratio of the maximum eigenvalue of the sample autocorrelation matrix to the signal variance; the deviation increases as the signal dynamic range increases. Simulations are presented that verify the theory. Furthermore, a stable finite-precision RLS algorithm is derived by modeling roundoff errors and incorporating their effects into the algorithm.
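The sensitivity described above can be illustrated with a minimal sketch of the exponentially weighted RLS recursion run at two precisions. Everything specific here is an assumption for illustration: the unknown system `h`, the forgetting factor, the AR(1) coefficient used to generate correlated input, and the use of `float32` arithmetic as a stand-in for true fixed-point roundoff.

```python
import numpy as np

def rls_identify(x, d, order=4, lam=0.99, delta=100.0, dtype=np.float64):
    """Exponentially weighted RLS system identification.

    dtype=np.float32 crudely emulates reduced-precision arithmetic
    (the paper analyzes fixed-point roundoff; this is only a sketch).
    """
    w = np.zeros(order, dtype=dtype)
    P = (delta * np.eye(order)).astype(dtype)   # inverse-correlation estimate
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1].astype(dtype)  # regressor, newest first
        Pu = P @ u
        k = Pu / (lam + u @ Pu)                 # gain vector
        e = dtype(d[n]) - w @ u                 # a priori error
        w = w + k * e
        P = ((P - np.outer(k, Pu)) / lam).astype(dtype)
    return w

rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2, 0.1])             # hypothetical unknown system
N = 2000
white = rng.standard_normal(N)

# Correlated input: AR(1) process, which raises the eigenvalue spread
# of the sample autocorrelation matrix.
corr = np.empty(N)
corr[0] = white[0]
for n in range(1, N):
    corr[n] = 0.95 * corr[n - 1] + white[n]

for name, x in [("white", white), ("AR(1)", corr)]:
    d = np.convolve(x, h)[:N]                   # noise-free desired signal
    w64 = rls_identify(x, d, dtype=np.float64)
    w32 = rls_identify(x, d, dtype=np.float32)
    dev = np.linalg.norm(w32 - w64)
    print(f"{name:6s} input: 32-bit vs 64-bit coefficient deviation = {dev:.2e}")
```

Under the paper's analysis one would expect the low-precision deviation to be larger for the correlated input, since the bounds scale with the ratio of the maximum eigenvalue of the sample autocorrelation matrix to the signal variance.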