Spin-glass models of a neural network
- 1 October 1986
- Research article
- Published by the American Physical Society (APS) in Physical Review A
- Vol. 34 (4), 3435-3445
- https://doi.org/10.1103/physreva.34.3435
Abstract
A general theory of spin-glass-like neural networks with a Monte Carlo dynamics and finitely many attractors (stored patterns) is presented. The long-time behavior of these models is determined by the equilibrium statistical mechanics of certain infinite-range Ising spin glasses, whose thermodynamic stability is analyzed in detail. As special cases we consider the Hopfield and the Little models and show that the free energy of the latter is twice that of the former because of a duplication of spin variables which occurs in the Little model. It is also indicated how metastable states can be partly suppressed or even completely avoided.
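As an illustrative sketch (not the paper's own formulation), the setup the abstract refers to can be pictured as a network of Ising spins with Hebbian couplings built from a finite number of stored patterns, relaxed by single-spin-flip Monte Carlo (Glauber) dynamics at temperature T. All numerical values below (network size, number of patterns, temperature, noise level) are arbitrary choices for demonstration.

```python
# Minimal Hopfield-network sketch: Hebbian couplings from finitely many stored
# patterns, plus single-spin-flip Monte Carlo (Glauber) dynamics. Illustrative
# only; parameter values are arbitrary and not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

N = 200          # number of Ising spins (neurons), S_i = +/- 1
p = 3            # finitely many stored patterns (attractors)
T = 0.2          # temperature of the Monte Carlo dynamics
steps = 50 * N   # number of attempted single-spin flips

# Stored patterns xi^mu and Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu
xi = rng.choice([-1, 1], size=(p, N))
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# Start from a noisy version of pattern 0 and let the dynamics relax it.
S = xi[0].copy()
flip = rng.random(N) < 0.15          # corrupt 15% of the spins
S[flip] *= -1

for _ in range(steps):
    i = rng.integers(N)
    h = J[i] @ S                      # local field acting on spin i
    # Glauber rule: set S_i = +1 with probability 1 / (1 + exp(-2 h / T))
    S[i] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * h / T)) else -1

# Overlaps m_mu = (1/N) sum_i xi_i^mu S_i measure how well each pattern is retrieved.
overlaps = xi @ S / N
print("overlaps with stored patterns:", np.round(overlaps, 3))
```

Run as-is, the first overlap should relax close to 1 while the others stay near 0, i.e. the corrupted pattern is retrieved; raising T or the corruption level degrades retrieval.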