Learning and retrieval in attractor neural networks above saturation
- 7 February 1991
- journal article
- Published by IOP Publishing in Journal of Physics A: General Physics
- Vol. 24 (3), 715-734
- https://doi.org/10.1088/0305-4470/24/3/030
Abstract
The authors investigate neural networks in the range of parameters where the ground-state energy is positive; namely, when a synaptic matrix which satisfies all the desired constraints cannot be found by the learning algorithm. In particular, they calculate the typical distribution functions of local stabilities obtained for a number of algorithms in this region. These functions are used to investigate the retrieval properties as reflected by the size of the basins of attraction. This is done analytically in sparsely connected networks, and numerically in fully connected networks. The main conclusion is that the retrieval behaviour of attractor neural networks can be improved by learning above saturation.
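The setting described in the abstract can be illustrated with a minimal sketch (not the paper's exact algorithms): a single perceptron loaded above the Gardner capacity, trained with a minover-style rule that repeatedly reinforces the least-stable pattern. The distribution of local stabilities Delta^mu = sigma^mu (J . xi^mu) / |J| then shows a fraction of patterns stuck below the margin, as no coupling vector can satisfy all constraints. The values of N, P, kappa and the update rule are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch, not the paper's method: train one perceptron above
# saturation and inspect the resulting distribution of local stabilities.
rng = np.random.default_rng(0)
N, P = 50, 120          # alpha = P/N = 2.4 > alpha_c = 2: no perfect solution exists
kappa = 0.0             # target stability margin (assumed for illustration)
sweeps = 2000

xi = rng.choice([-1.0, 1.0], size=(P, N))     # random binary input patterns
sigma = rng.choice([-1.0, 1.0], size=P)       # desired output bits

J = (sigma[:, None] * xi).sum(axis=0) / N     # Hebbian initial couplings

for _ in range(sweeps):
    stab = sigma * (xi @ J) / np.linalg.norm(J)
    mu = int(np.argmin(stab))                 # least-stable pattern
    if stab[mu] >= kappa:
        break                                 # all constraints satisfied (unreachable above alpha_c)
    J += sigma[mu] * xi[mu] / N               # perceptron-style reinforcement

stab = sigma * (xi @ J) / np.linalg.norm(J)   # final local stabilities
print(f"mean stability {stab.mean():.2f}, "
      f"fraction below kappa {np.mean(stab < kappa):.2f}")
```

Above saturation the fraction of stabilities below kappa stays strictly positive no matter how long the rule runs; the paper's point is that the *shape* of this distribution, which differs between algorithms, controls the size of the basins of attraction.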
This publication has 18 references indexed in Scilit:
- Statistical mechanics of neural networks near saturation. Published by Elsevier, 2004
- Retrieval phase diagrams for attractor neural networks with optimal interactions. Journal of Physics A: General Physics, 1990
- Learning in neural network memories. Network: Computation in Neural Systems, 1990
- The AdaTron: An Adaptive Perceptron Algorithm. Europhysics Letters, 1989
- Optimal learning in neural network memories. Journal of Physics A: General Physics, 1989
- Optimal basins of attraction in randomly sparse neural network models. Journal of Physics A: General Physics, 1989
- The space of interactions in neural network models. Journal of Physics A: General Physics, 1988
- An Exactly Solvable Asymmetric Neural Network Model. Europhysics Letters, 1987
- Learning of correlated patterns in spin-glass networks by local learning rules. Physical Review Letters, 1987
- Storing Infinite Numbers of Patterns in a Spin-Glass Model of Neural Networks. Physical Review Letters, 1985