Self-Organized Formation of Various Invariant-Feature Filters in the Adaptive-Subspace SOM
- 1 August 1997
- journal article
- Published by MIT Press in Neural Computation
- Vol. 9 (6), 1321-1344
- https://doi.org/10.1162/neco.1997.9.6.1321
Abstract
The adaptive-subspace self-organizing map (ASSOM) is a modular neural network architecture, the modules of which learn to identify input patterns subject to some simple transformations. The learning process is unsupervised, competitive, and related to that of the traditional SOM (self-organizing map). Each neural module becomes adaptively specific to some restricted class of transformations, and modules close to each other in the network become tuned to similar features in an orderly fashion. If different transformations exist in the input signals, different subsets of ASSOM units become tuned to these transformation classes.
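The sketch below is a minimal, simplified illustration of the competitive subspace learning the abstract describes: each module holds an orthonormal basis spanning a subspace, an "episode" of transformed samples is matched against every module by projection energy, and the winner and its lattice neighbors rotate their bases toward the episode. All parameter names, the Gaussian neighborhood, the toy episodes built from circular shifts, and the QR re-orthonormalization step are assumptions for illustration; they are not the exact update rule or episode construction used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the paper).
n_units = 10          # number of ASSOM modules on a 1-D lattice
dim = 16              # input dimensionality
subdim = 2            # dimension of each module's subspace
n_epochs = 200
episode_len = 5       # samples per episode of transformed patterns

# Each module i holds an orthonormal basis B[i] of shape (dim, subdim).
B = rng.standard_normal((n_units, dim, subdim))
for i in range(n_units):
    B[i], _ = np.linalg.qr(B[i])   # orthonormalize the columns

def projection_energy(basis, episode):
    """Sum of squared projections of the episode samples onto the subspace."""
    coords = episode @ basis        # (episode_len, subdim) subspace coordinates
    return np.sum(coords ** 2)

def neighborhood(winner, i, radius):
    """Gaussian SOM-style neighborhood on the 1-D lattice (assumed form)."""
    return np.exp(-((i - winner) ** 2) / (2.0 * radius ** 2))

for epoch in range(n_epochs):
    lr = 0.1 * (1.0 - epoch / n_epochs)            # decaying learning rate
    radius = max(0.5, 3.0 * (1.0 - epoch / n_epochs))

    # Toy episode: one random pattern plus small translations (circular shifts).
    x = rng.standard_normal(dim)
    episode = np.stack([np.roll(x, s) for s in range(episode_len)])

    # Competition: the module whose subspace captures the most episode energy wins.
    energies = [projection_energy(B[i], episode) for i in range(n_units)]
    winner = int(np.argmax(energies))

    # Adaptation: pull each neighboring basis toward the episode samples and
    # re-orthonormalize (a simplified stand-in for the paper's rotation update).
    for i in range(n_units):
        h = neighborhood(winner, i, radius)
        for s in episode:
            coords = B[i].T @ s                     # coordinates of s in the subspace
            B[i] += lr * h * np.outer(s, coords) / (s @ s + 1e-12)
        B[i], _ = np.linalg.qr(B[i])
```

With translation episodes as above, modules tend to specialize so that nearby units on the lattice capture similar shift-invariant features, which is the qualitative behavior the abstract attributes to the ASSOM.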