Information storage and retrieval in synchronous neural networks
- 1 September 1987
- journal article
- research article
- Published by American Physical Society (APS) in Physical Review A
- Vol. 36 (5), 2475-2477
- https://doi.org/10.1103/physreva.36.2475
Abstract
Little's synchronous model for a neural network is studied in the regime where an infinite number of patterns is to be stored. We show that its retrieval capacity can become much larger than in Hopfield's model, revealing a greater robustness of synchronous updating to noise. The phase diagram, which includes a parameter controlling the occurrence of cycles, exhibits paramagnetic, ferromagnetic, and spin-glass phases together with a line of tricritical points.
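The paper itself works at finite temperature with the number of patterns growing proportionally to the network size; purely as an illustration of the setup the abstract describes, the following is a minimal zero-temperature sketch of Little-style synchronous (parallel) dynamics with Hebbian couplings, contrasted with Hopfield's model only in that every spin is updated simultaneously. The network size, pattern count, noise level, and step count below are arbitrary choices for the example, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500   # number of +/-1 neurons (illustrative choice)
P = 25    # number of stored random patterns, so alpha = P/N = 0.05

# Hebbian couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, zero diagonal
patterns = rng.choice([-1, 1], size=(P, N))
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def synchronous_step(s):
    """Little dynamics at zero temperature: all neurons refreshed at once."""
    return np.where(J @ s >= 0, 1, -1)

# Start from a corrupted copy of pattern 0 (flip 20% of the bits)
s = patterns[0].copy()
s[rng.random(N) < 0.2] *= -1

for _ in range(20):
    s = synchronous_step(s)

# Overlap m = (1/N) sum_i s_i xi_i^0; m close to 1 means successful retrieval
overlap = np.mean(s * patterns[0])
print(f"overlap with stored pattern: {overlap:.3f}")
```

With parallel updating the network can also settle into two-step cycles rather than fixed points, which is why the paper's phase diagram carries an extra parameter controlling the occurrence of cycles; the sketch above simply iterates a fixed number of steps and reports the final overlap.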
References
- Statistical mechanics of neural networks near saturation. Published by Elsevier, 2004
- Storing Infinite Numbers of Patterns in a Spin-Glass Model of Neural Networks. Physical Review Letters, 1985
- Spin-glass models of neural networks. Physical Review A, 1985
- Neurons with graded response have collective computational properties like those of two-state neurons. Proceedings of the National Academy of Sciences, 1984
- Collective properties of neural networks: A statistical physics approach. Biological Cybernetics, 1984
- Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences, 1982
- Analytic study of the memory storage capacity of a neural network. Mathematical Biosciences, 1978
- The existence of persistent states in the brain. Mathematical Biosciences, 1974