Training noise adaptation in attractor neural networks
- 21 February 1990
- journal article
- Published by IOP Publishing in Journal of Physics A: Mathematical and General
- Vol. 23 (4), L175-L182
- https://doi.org/10.1088/0305-4470/23/4/009
Abstract
The authors consider synaptic neural networks which minimise the output error on the stored patterns when the input patterns are ensembles of their noisy versions with overlap m_t with the clean patterns. When m_t is infinitesimally less than 1, the network automatically attains maximal stability, confirming the usefulness of training noise in enhancing memory associativity. When m_t drops further below 1, the field distribution consists of two bands for large m_t and a single continuous band for small m_t. Errorless retrieval is impossible for training noise of order N^0. As the training noise increases, the retrieval overlap deteriorates, although memory associativity does increase for sufficiently low storage.