Neural networks with nonlinear synapses and a static noise
- 1 September 1986
- journal article
- research article
- Published by American Physical Society (APS) in Physical Review A
- Vol. 34 (3), 2571-2574
- https://doi.org/10.1103/physreva.34.2571
Abstract
The theory of neural networks is extended to include a static noise as well as nonlinear updating of synapses by learning. The noise appears either in the form of spin-glass interactions, which are independent of the learning process, or as a random decay of synapses. In an unsaturated network, the nonlinear learning algorithms may modify the energy surface and lead to interesting new computational capabilities. Close to saturation, they act as an additional source of static noise. The effect of the noise on memory storage is calculated.
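The spin-glass-interaction variant described in the abstract can be sketched numerically: store a few patterns in a Hopfield-type network with the Hebb rule, add symmetric random couplings that are independent of the learning process, and check that retrieval survives far from saturation. This is a minimal illustrative sketch, not the paper's calculation; the network size, pattern count, and `noise_level` amplitude are assumptions chosen so the unsaturated regime holds.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200            # number of neurons (assumption)
P = 3              # stored patterns, P/N well below saturation (assumption)
noise_level = 0.1  # amplitude of the static spin-glass noise (assumption)

# Store random +/-1 patterns with the standard Hebb rule.
patterns = rng.choice([-1, 1], size=(P, N))
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0)

# Static noise: symmetric Gaussian couplings independent of learning,
# playing the role of the spin-glass interactions in the abstract.
noise = rng.normal(0.0, noise_level / np.sqrt(N), size=(N, N))
noise = (noise + noise.T) / 2
np.fill_diagonal(noise, 0)
J_noisy = J + noise

def recall(J, state, sweeps=10):
    """Zero-temperature asynchronous dynamics: repeatedly align each
    spin with its local field until the configuration settles."""
    state = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(state)):
            state[i] = 1 if J[i] @ state >= 0 else -1
    return state

# Probe with a corrupted copy of pattern 0 (about 14% of spins flipped).
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 7, replace=False)
probe[flip] *= -1

fixed = recall(J_noisy, probe)
overlap = fixed @ patterns[0] / N
print(f"overlap with stored pattern: {overlap:.3f}")
```

In this unsaturated regime the overlap with the stored pattern stays near 1 despite the static noise; pushing `P` toward saturation or raising `noise_level` degrades retrieval, which is the trade-off the paper quantifies.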