Learning in neural networks with local minima
- 1 October 1992
- journal article
- research article
- Published by American Physical Society (APS) in Physical Review A
- Vol. 46 (8), 5221–5231
- https://doi.org/10.1103/physreva.46.5221
Abstract
An attempt is made to study learning in neural networks with local minima. For small learning parameters η, the transition time from one minimum to another is asymptotically given by exp(η̃/η), with η̃ a constant, independent of η, called the reference learning parameter. A general scheme to calculate the reference learning parameter is presented. This scheme is valid for a large class of learning rules.
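The exponential scaling exp(η̃/η) of the transition time can be illustrated numerically. The sketch below is a minimal toy model, not the paper's scheme: a single weight follows a noisy gradient rule on a hypothetical double-well cost E(w) = (w² − 1)²/4, and the mean number of steps to cross the barrier at w = 0 is measured for several values of η. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def escape_steps(eta, w0=-1.0, max_steps=2_000_000):
    """Steps until the weight leaves the left well of the
    double-well cost E(w) = (w^2 - 1)^2 / 4 under a noisy
    gradient rule (noise amplitude ~ eta, as for online learning)."""
    w = w0
    for t in range(max_steps):
        grad = w * (w * w - 1.0)              # dE/dw
        w += -eta * grad + eta * rng.normal() # hypothetical update rule
        if w > 0.0:                           # crossed the barrier at w = 0
            return t
    return max_steps

# Mean transition time for a few learning parameters; halving eta
# should increase the escape time roughly like exp(eta_tilde / eta).
times = {eta: np.mean([escape_steps(eta) for _ in range(20)])
         for eta in (0.3, 0.2, 0.1)}
```

Plotting log(times[η]) against 1/η should give an approximately straight line for small η, whose slope plays the role of the reference learning parameter η̃ in this toy setting.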
This publication has 12 references indexed in Scilit:
- Learning-parameter adjustment in neural networks, Physical Review A, 1992
- Learning processes in neural networks, Physical Review A, 1991
- Convergence of learning algorithms with constant learning rates, IEEE Transactions on Neural Networks, 1991
- A convergence theorem for Grossberg learning, Neural Networks, 1990
- Convergence properties of Kohonen's topology conserving maps: fluctuations, stability, and dimension selection, Biological Cybernetics, 1988
- Learning representations by back-propagating errors, Nature, 1986
- Simplified neuron model as a principal component analyzer, Journal of Mathematical Biology, 1982
- Neural networks and physical systems with emergent collective computational abilities, Proceedings of the National Academy of Sciences, 1982
- Self-organized formation of topologically correct feature maps, Biological Cybernetics, 1982
- On the Relation between Master Equations and Random Walks and Their Solutions, Journal of Mathematical Physics, 1971