Anti-Hebbian learning in topologically constrained linear networks: a tutorial
- 1 January 1993
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 4 (5), 748-761
- https://doi.org/10.1109/72.248453
Abstract
Using standard results from the adaptive signal processing literature, we review the learning behavior of various constrained linear neural networks made up of anti-Hebbian synapses, where learning is driven by the criterion of minimizing the node information energy. We point out how simple learning rules of Hebbian type can provide fast self-organization under rather wide connectivity constraints. We verify the results of the theory in a set of simulations.
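The abstract does not spell out the update rule, so the following is only a minimal sketch of the kind of anti-Hebbian decorrelation scheme it alludes to, written against standard formulations from the adaptive signal processing and neural network literature rather than the paper itself. The lateral-weight matrix `L`, the learning rate `eta`, the identity feedforward mapping, and the synthetic correlated inputs are all illustrative assumptions.

```python
# Minimal sketch, assuming a linear layer with lateral anti-Hebbian weights
# (Foldiak-style decorrelation); constants and names here are illustrative,
# not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

n = 4
eta = 0.001                              # learning rate (assumed; may need tuning)
L = np.zeros((n, n))                     # lateral anti-Hebbian weights, learned online

# Unit-variance inputs with 0.3 pairwise correlation; the lateral weights
# should learn to suppress the correlations between output units.
cov = np.full((n, n), 0.3)
np.fill_diagonal(cov, 1.0)
X = rng.multivariate_normal(np.zeros(n), cov, size=100_000)

outputs = []
for x in X:
    y = x + L @ x                        # identity feedforward plus one lateral pass
    dL = -eta * np.outer(y, y)           # anti-Hebbian: co-active units inhibit each other
    np.fill_diagonal(dL, 0.0)            # no self-connections
    L += dL
    outputs.append(y)

# Off-diagonal output covariances should end up much smaller than the
# input correlation of 0.3, i.e. the shared "energy" between units shrinks.
print(np.round(np.cov(np.array(outputs[-20_000:]).T), 3))
```

A single lateral pass is used here instead of solving the feedback loop to its fixed point, which keeps the sketch short; the networks and connectivity constraints analyzed in the paper are more general than this toy case.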