Retrieval phase diagrams for attractor neural networks with optimal interactions

Abstract
The authors consider the retrieval properties of attractor neural networks whose synaptic matrices have been constructed to maximise the number of patterns that can be stored in a perceptron satisfying certain constraints. Retrieval is studied both in the absence and in the presence of fast noise (temperature). The discussion is restricted to dilute networks, for which dynamical equations for the overlaps are available. When the patterns are stored with a prescribed lower limit on the stability parameter at every site, the full stability of the perceptron ensures the existence of an attractor with perfect retrieval. The curve of critical storage capacity (α) against temperature (T) is found to be a line of first-order transitions at high values of α that becomes second order at low α, the two regimes meeting at a tricritical point. The phase diagram is compared with that of the dilute Hopfield model: at high synaptic noise levels the dilute Hopfield net stores more effectively than the network trained for optimal perceptron storage. When a given fraction of sites is allowed to violate the stability bound, the solution of the perceptron 'learning' problem no longer ensures the existence of an attractor of finite overlap, even in the absence of noise. This case is studied separately for T=0 and for finite T.
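For orientation, the mean-field machinery the abstract alludes to can be stated compactly. In the strongly dilute limit the overlap m_t with a stored pattern obeys a closed one-step map determined by the distribution ρ(λ) of stability parameters of the trained couplings. The equations below are the standard forms for this class of models, written in assumed notation as a sketch of the framework, not necessarily the paper's exact conventions.

```latex
% Sketch of the standard dilute-network overlap dynamics and the Gardner
% capacity; notation (m_t, rho, lambda, kappa, beta) is assumed here,
% not taken from the paper itself.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Zero-temperature map: given overlap m_t, the local field aligned with
% the pattern is Gaussian with mean \lambda m_t and variance 1 - m_t^2,
% so averaging the sign update over sites gives
\begin{equation}
  m_{t+1} \;=\; \int \mathrm{d}\lambda\, \rho(\lambda)\,
  \operatorname{erf}\!\left( \frac{\lambda\, m_t}{\sqrt{2\,(1 - m_t^{2})}} \right).
\end{equation}

% If every stability satisfies \lambda \ge \kappa > 0, then m_t = 1 yields
% m_{t+1} = 1: perfect retrieval is a fixed point, consistent with the
% abstract's claim that full stability guarantees a perfect attractor.

% Fast noise at temperature T = 1/\beta replaces the deterministic sign
% update by a thermal average:
\begin{equation}
  m_{t+1} \;=\; \int \mathrm{d}\lambda\, \rho(\lambda) \int \mathrm{D}z\;
  \tanh\!\bigl[\beta\bigl(\lambda\, m_t + z\sqrt{1 - m_t^{2}}\bigr)\bigr],
  \qquad
  \mathrm{D}z = \frac{\mathrm{d}z}{\sqrt{2\pi}}\, e^{-z^{2}/2}.
\end{equation}

% The optimal (Gardner) storage capacity at prescribed stability \kappa,
% which couplings trained for maximal perceptron storage saturate, is
\begin{equation}
  \alpha_c(\kappa)^{-1} \;=\; \int_{-\kappa}^{\infty} \mathrm{D}t\,(t + \kappa)^{2},
  \qquad \alpha_c(0) = 2.
\end{equation}

\end{document}
```

In this picture the α against T phase boundary is read off from the fixed-point structure of the finite-temperature map: where the retrieval fixed point disappears discontinuously the transition is first order, and where it merges continuously with m=0 it is second order, which is the usual mechanism behind a tricritical point of the kind the abstract reports.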