Training with noise and the storage of correlated patterns in a neural network model
- 21 June 1989
- journal article
- Published by IOP Publishing in Journal of Physics A: Mathematical and General
- Vol. 22 (12), 2019-2030
- https://doi.org/10.1088/0305-4470/22/12/007
Abstract
Local iterative learning algorithms for the interactions between Ising spins in neural network models are discussed. They converge to solutions with basins of attraction whose shape is determined by the noise in the training data, provided such solutions exist. The training is applied both to the storage of random patterns and to a model for the storage of correlated words. The existence of correlations increases the storage capacity of a given network beyond that for random patterns. The model can be modified to store cycles of patterns and in particular is applied to the storage of continuous items of English text.
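The abstract's idea of a local iterative rule trained on noisy data can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the paper's exact algorithm: it uses a perceptron-style update for each spin's incoming couplings, presenting both a clean pattern and a noise-corrupted copy (a fraction `flip_prob` of spins flipped) and nudging the couplings whenever a spin's local field disagrees with the clean target. Training on corrupted examples is what shapes the basins of attraction. All sizes, the noise level, and the epoch count are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5          # spins and stored patterns (low load for the demo)
flip_prob = 0.05       # training noise: fraction of spins flipped per example

# Random Ising (+/-1) patterns to store
patterns = rng.choice([-1, 1], size=(P, N))
J = np.zeros((N, N))   # couplings J_ij; diagonal kept at zero

for _ in range(300):
    for mu in range(P):
        target = patterns[mu]
        # Noisy copy: each spin flipped independently with prob. flip_prob
        flips = rng.choice([1, -1], size=N, p=[1 - flip_prob, flip_prob])
        for sample in (target, target * flips):
            h = J @ sample                  # local fields given the input
            wrong = (h * target) <= 0       # spins misaligned with the target
            # Local Hebbian correction only for the misaligned spins i:
            # Delta J_ij = xi_i^mu * S_j / N
            J[wrong] += np.outer(target[wrong], sample) / N
            np.fill_diagonal(J, 0.0)

# Check that the clean patterns are (near-)fixed points of one update sweep
recalled = np.sign(J @ patterns.T).T
accuracy = (recalled == patterns).mean()
print(accuracy)
```

At this low storage load the rule converges quickly, and the noisy presentations make recall tolerant of corrupted cues, which is the qualitative effect the abstract attributes to training with noise.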