An analysis of the GLVQ algorithm
- 1 July 1995
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 6 (4), 1012-1016
- https://doi.org/10.1109/72.392266
Abstract
Generalized learning vector quantization (GLVQ) has been proposed as a generalization of the simple competitive learning (SCL) algorithm. The main argument for the GLVQ proposal is its superior insensitivity to the initial values of the weights (code vectors). In this paper we show that the distinctive characteristics of the definition of GLVQ disappear outside a small domain of applications. GLVQ becomes identical to SCL when either the number of code vectors grows or the size of the input space is large. Moreover, the behavior of GLVQ is inconsistent for problems defined on very small-scale input spaces: the adaptation rules fluctuate between performing descent and ascent searches on the gradient of the distortion function.
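The SCL baseline that the abstract compares GLVQ against can be sketched as follows. This is a minimal illustration, not the paper's code; the function name, learning rate, and data are our own assumptions:

```python
import numpy as np

def scl_update(codebook, x, lr=0.1):
    """One simple competitive learning (SCL) step: only the winning
    code vector (the one nearest to input x) moves toward x.
    (Illustrative sketch; names and defaults are assumptions.)"""
    # Euclidean distance from x to every code vector
    d = np.linalg.norm(codebook - x, axis=1)
    winner = np.argmin(d)
    # Winner-take-all update; all other code vectors stay fixed
    codebook[winner] += lr * (x - codebook[winner])
    return codebook

codebook = np.array([[0.0, 0.0], [10.0, 10.0]])
x = np.array([1.0, 1.0])
updated = scl_update(codebook, x)
# Only the nearer code vector moved toward x
```

GLVQ modifies this scheme so that every code vector receives an update weighted by a global function of all the distances; the paper's point is that, as the number of code vectors or the input-space dimension grows, those global weights degenerate and the update collapses back to the winner-take-all rule above.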
This publication has 3 references indexed in Scilit:
- Generalized clustering networks and Kohonen's self-organizing scheme. IEEE Transactions on Neural Networks, 1993
- Competitive learning and soft competition for vector quantizer design. IEEE Transactions on Signal Processing, 1992
- Self-Organization and Associative Memory. Springer Nature, 1989