Storing extensively many weighted patterns in a saturated neural network
- 21 August 1987
- journal article
- Published by IOP Publishing in Journal of Physics A: Mathematical and General
- Vol. 20 (12), 3989-3999
- https://doi.org/10.1088/0305-4470/20/12/043
Abstract
The performance of the Hopfield model of a neural network with extensively many weighted patterns is analysed. If the system size is N, then N patterns, each provided with a suitable weight, are stored. The weights may be associated with a temporal order and, if appropriately chosen, they allow a gradual fading out of the extensively many stored patterns. Particular emphasis is put on the underlying mathematical structure.
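The construction described in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's own formulation: it builds a Hebbian coupling matrix in which each pattern carries a weight, and chooses (as one hypothetical example of temporally ordered weights) an exponential decay so that older patterns fade gradually. The decay factor `lam` and the system sizes are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200    # network size
P = 50     # number of stored patterns
lam = 0.95 # hypothetical decay factor for the temporally ordered weights

# Random binary patterns xi[mu] in {-1,+1}^N; mu = 0 is the oldest.
xi = rng.choice([-1, 1], size=(P, N))

# Weights fading with age: the newest pattern gets weight 1,
# older ones lam, lam**2, ... , so early patterns gradually fade out.
w = lam ** np.arange(P)[::-1]

# Weighted Hebbian couplings: J_ij = (1/N) * sum_mu w_mu xi_i^mu xi_j^mu
J = (xi.T * w) @ xi / N
np.fill_diagonal(J, 0.0)  # no self-coupling

def recall(s, steps=20):
    """Zero-temperature parallel dynamics: s -> sign(J s)."""
    for _ in range(steps):
        s = np.where(J @ s >= 0, 1, -1)
    return s

# The most recent (highest-weight) pattern should be close to a fixed point;
# the overlap m = (1/N) s . xi measures retrieval quality.
m = recall(xi[-1].copy()) @ xi[-1] / N
print(f"overlap with newest pattern: {m:.3f}")
```

With exponentially decaying weights the effective load is governed by the sum of squared weights rather than by the raw pattern count, which is what permits storing extensively many patterns while the oldest ones are no longer retrievable.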