Optimally adapted attractor neural networks in the presence of noise

Abstract
By adapting an attractor neural network to an appropriate training overlap, the authors optimize its attractor overlap, and hence its storage capacity, in the presence of retrieval noise (temperature). The training overlap is determined self-consistently by the optimal attractor overlap. The phase diagram of the optimal attractor overlap in the temperature-storage space is found. A novel coexistence phase of strong and weak retrievers is present. The maximum storage capacity deviates from that of the maximally stable network as the temperature increases, and in the high-temperature regime (T ≥ 0.38 for Gaussian noise) the Hopfield network yields the maximum storage capacity. This analysis demonstrates the principles of specialization and adaptation in neural networks.
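The quantities discussed above (attractor overlap, storage level, retrieval noise) can be illustrated with the standard Hopfield setup, which the abstract uses as its high-temperature reference point. The sketch below estimates the retrieval overlap m of a Hopfield network storing P = αN random patterns under Glauber (thermal) dynamics at temperature T; it does not reproduce the authors' training-overlap adaptation scheme, and all parameter values and function names are illustrative assumptions rather than values from the paper.

```python
import numpy as np

def hopfield_overlap(N=1000, alpha=0.05, T=0.2, sweeps=20, seed=0):
    """Estimate the attractor overlap m of a Hopfield network at temperature T.

    A minimal sketch: Hebbian couplings, asynchronous Glauber dynamics,
    and a perfect cue on pattern 1.  Parameters are illustrative only.
    """
    rng = np.random.default_rng(seed)
    P = max(1, int(alpha * N))
    xi = rng.choice([-1, 1], size=(P, N))      # P random +/-1 patterns
    J = (xi.T @ xi) / N                        # Hebbian coupling matrix
    np.fill_diagonal(J, 0.0)                   # no self-coupling
    s = xi[0].copy()                           # start from pattern 1 (perfect cue)
    for _ in range(sweeps):
        for i in rng.permutation(N):           # asynchronous single-spin updates
            h = J[i] @ s                       # local field on neuron i
            x = np.clip(-2.0 * h / T, -50, 50) # avoid overflow in exp
            p_plus = 1.0 / (1.0 + np.exp(x))   # Glauber flip probability
            s[i] = 1 if rng.random() < p_plus else -1
    return (xi[0] @ s) / N                     # overlap with the cued pattern

if __name__ == "__main__":
    for T in (0.1, 0.3, 0.5):
        print(f"T = {T:.1f}: m ~ {hopfield_overlap(T=T):.3f}")
```

Sweeping T at fixed storage level α in this way shows the retrieval overlap degrading with noise, which is the regime in which the paper's optimally adapted networks are compared against the Hopfield and maximally stable networks.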
