Convergence of a neural network classifier
- 1 January 1990
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- p. 1735-1740 vol.3
- https://doi.org/10.1109/cdc.1990.203918
Abstract
It is shown that the LVQ (learning vector quantization) learning algorithm converges to locally asymptotically stable equilibria of an ordinary differential equation. It is demonstrated that the learning algorithm performs stochastic approximation. Convergence of the Voronoi vectors is guaranteed under appropriate conditions on the underlying statistics of the classification problem. The authors also present a modification to the learning algorithm which, it is argued, results in convergence of the LVQ for a larger set of initial conditions. Finally, it is shown that the LVQ is a general histogram classifier and that its risk converges to the Bayesian optimal risk as the appropriate parameters go to infinity with the number of past observations.
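For context, the standard LVQ1 update the abstract analyzes moves the nearest Voronoi vector toward a training sample when their class labels agree and away when they disagree. A minimal sketch of one such stochastic-approximation step, with hypothetical function and variable names not taken from the paper:

```python
import numpy as np

def lvq1_step(prototypes, proto_labels, x, y, lr):
    """One LVQ1 update (illustrative sketch, not the paper's exact notation).

    Finds the Voronoi vector nearest to sample x and moves it toward x
    if its label matches y, otherwise away from x, with step size lr.
    """
    dists = np.linalg.norm(prototypes - x, axis=1)  # distance to each prototype
    i = int(np.argmin(dists))                       # index of nearest prototype
    sign = 1.0 if proto_labels[i] == y else -1.0    # attract or repel
    prototypes[i] += sign * lr * (x - prototypes[i])
    return prototypes

# Usage: two prototypes, one per class; sample of class 0 near prototype 0.
protos = np.array([[0.0, 0.0], [1.0, 1.0]])
labels = np.array([0, 1])
protos = lvq1_step(protos, labels, np.array([0.2, 0.1]), 0, lr=0.5)
# prototype 0 moves halfway toward the sample: [0.1, 0.05]
```

In the stochastic-approximation view the paper takes, letting the step size lr decay over iterations makes these updates track an ordinary differential equation whose stable equilibria the prototypes converge to.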