Training noise adaptation in attractor neural networks

Abstract
The authors consider synaptic neural networks that minimise the output error on the stored patterns when the input patterns are drawn from ensembles of noisy versions of those patterns, with overlap m_t with the clean patterns. When m_t is infinitesimally less than 1, the network automatically attains maximal stability, confirming the usefulness of training noise in enhancing memory associativity. When m_t falls below 1, the field distribution consists of two bands for large m_t and of a single continuous band for small m_t. Errorless retrieval becomes impossible for training noise of order N^0, i.e. when 1 - m_t remains finite as N grows. As the training noise increases, the retrieval overlap deteriorates, although memory associativity still improves at sufficiently low storage.
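To make the setting concrete, the following is a minimal Python sketch of the scenario the abstract describes: an attractor network trained on noisy exemplars (training overlap m_t) of the clean patterns, with retrieval quality measured by the overlap reached from a noisy cue. This is an illustrative assumption, not the authors' exact formulation; the perceptron-style learning rule, the parameter values (N, P, m_t), and all function names are hypothetical choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200          # neurons
P = 10           # stored patterns (low storage: alpha = P/N = 0.05)
m_t = 0.8        # training overlap of noisy exemplars with clean patterns
EPOCHS = 30
EXEMPLARS = 20   # noisy versions of each pattern per epoch

patterns = rng.choice([-1, 1], size=(P, N))

def sgn(v):
    # Deterministic sign that maps 0 to +1, avoiding dead units.
    return np.where(v >= 0, 1, -1)

def noisy(xi, m):
    # Flip each bit independently with probability (1 - m)/2,
    # so the expected overlap with xi is m.
    flip = rng.random(xi.shape) < (1 - m) / 2
    return np.where(flip, -xi, xi)

# Perceptron-style training on noisy inputs with clean targets:
# whenever output unit i misclassifies a noisy exemplar x of pattern xi,
# nudge its coupling row toward xi_i * x (a hypothetical stand-in for
# the error-minimising cost function of the paper).
J = np.zeros((N, N))
for _ in range(EPOCHS):
    for xi in patterns:
        for _ in range(EXEMPLARS):
            x = noisy(xi, m_t)
            h = J @ x                    # local (aligning) fields
            wrong = sgn(h) != xi         # units whose output bit errs
            J[wrong] += np.outer(xi[wrong], x) / N
np.fill_diagonal(J, 0.0)                 # no self-couplings

def retrieve(xi, m_cue=0.8, steps=20):
    # Start from a noisy cue and iterate synchronous zero-temperature
    # dynamics; return the final overlap with the clean pattern.
    s = noisy(xi, m_cue)
    for _ in range(steps):
        s = sgn(J @ s)
    return (s @ xi) / N

print("retrieval overlaps:", [round(retrieve(xi), 3) for xi in patterns])
```

Raising m_t toward 1 in this sketch mimics the near-clean training limit the abstract associates with maximal stability, while lowering it trades retrieval overlap against wider basins of attraction, in the spirit of the reported noise/associativity trade-off.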
