Convergence and performance analysis of the normalized LMS algorithm with uncorrelated Gaussian data
- 1 July 1988
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 34 (4), 680-691
- https://doi.org/10.1109/18.9768
Abstract
It is demonstrated that the normalized least mean square (NLMS) algorithm can be viewed as a modification of the widely used LMS algorithm. The NLMS algorithm is shown to have an important advantage over the LMS algorithm: its convergence is independent of environmental changes. In addition, the authors present a comprehensive study of the first- and second-order behavior of the NLMS algorithm. They show that the NLMS algorithm achieves a significantly improved convergence rate compared with the LMS algorithm, while its steady-state performance is considerably worse.
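The record itself does not give the update equations. As a point of reference, a minimal sketch of the standard textbook forms of the LMS and NLMS weight recursions is shown below; the paper's own notation and step-size conventions may differ.

```latex
% Standard LMS and NLMS weight updates (textbook notation, not necessarily
% the paper's): w_k are the filter weights, x_k the input vector, d_k the
% desired response, mu / mu-bar the step sizes, epsilon a small regularizer.
\[
\begin{aligned}
e_k &= d_k - \mathbf{w}_k^{\mathsf{T}} \mathbf{x}_k \\
\text{LMS:}\quad  \mathbf{w}_{k+1} &= \mathbf{w}_k + \mu\, e_k\, \mathbf{x}_k \\
\text{NLMS:}\quad \mathbf{w}_{k+1} &= \mathbf{w}_k
  + \frac{\bar{\mu}}{\varepsilon + \lVert \mathbf{x}_k \rVert^{2}}\, e_k\, \mathbf{x}_k
\end{aligned}
\]
```

In this standard form, dividing the update by the input energy $\lVert \mathbf{x}_k \rVert^{2}$ makes the effective step size independent of the input signal power, which is the sense in which NLMS convergence is insensitive to changes in the environment.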