An optimum NLMS algorithm: performance improvement over LMS
- 1 January 1991
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- pp. 2125-2128, vol. 3
- https://doi.org/10.1109/icassp.1991.150826
Abstract
The authors prove that, for zero-mean white input data, the optimum algorithm that modifies the data vector in the LMS (least mean square) gradient estimate to achieve the lowest excess mean-square error for a given convergence rate is the normalized LMS (NLMS) algorithm. It is shown that this adaptive filtering algorithm is equivalent to recursive least-squares adaptation with a known diagonal data covariance matrix. Moreover, the algorithm can be interpreted as a modified LMS algorithm in which the iterated weight vector is used to form the error estimate. Both theoretical calculations and simulations with white Gaussian data show that this NLMS algorithm performs as much as 3.6 dB better than standard LMS for this input.
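The NLMS update the abstract refers to normalizes the LMS gradient step by the instantaneous input power. A minimal sketch in NumPy, applied to system identification with zero-mean white Gaussian input (the setting analyzed in the paper); the unknown system `h`, signal length, step size `mu`, and regularizer `eps` are illustrative assumptions, not values from the paper:

```python
import numpy as np

def nlms(x, d, num_taps, mu=0.5, eps=1e-8):
    """Normalized LMS adaptive filter (illustrative sketch).

    x: input signal, d: desired signal. Returns the final weight
    vector and the a priori error history. Dividing the step size
    by the input power ||u||^2 is what distinguishes NLMS from LMS.
    """
    w = np.zeros(num_taps)
    e_hist = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1 : n + 1][::-1]  # regressor [x[n], ..., x[n-M+1]]
        e = d[n] - w @ u                       # a priori error estimate
        w = w + (mu / (eps + u @ u)) * e * u   # power-normalized gradient step
        e_hist[n] = e
    return w, e_hist

# Identify a hypothetical unknown 4-tap FIR system driven by
# zero-mean white Gaussian input.
rng = np.random.default_rng(0)
h = np.array([0.8, -0.4, 0.2, 0.1])           # assumed unknown system
x = rng.standard_normal(5000)
d = np.convolve(x, h, mode="full")[: len(x)]  # noiseless desired signal
w, e = nlms(x, d, num_taps=4)
```

Because the normalization makes the effective step size independent of the input power, `mu` can be chosen in (0, 2) without knowing the input variance, which is the practical advantage over standard LMS.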
This publication has 3 references indexed in Scilit:
- Optimum error nonlinearities for LMS adaptation. Published by Institute of Electrical and Electronics Engineers (IEEE), 2002
- Behavior of the ε-normalized LMS algorithm with Gaussian inputs. IEEE Transactions on Acoustics, Speech, and Signal Processing, 1987
- On the optimum data nonlinearity in LMS adaptation. IEEE Transactions on Acoustics, Speech, and Signal Processing, 1986