Information storage and retrieval in synchronous neural networks

Abstract
Little’s synchronous model of a neural network is studied in the regime where an infinitely large number of patterns is to be stored. We show that its retrieval capacity can become much larger than that of Hopfield’s model, revealing the greater robustness of synchronous updating to noise. The phase diagram, which includes a parameter controlling the occurrence of cycles, exhibits paramagnetic, ferromagnetic, and spin-glass phases, together with a line of tricritical points.
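
For orientation, a minimal sketch of the commonly used formulation of synchronous (Little) dynamics with Hebbian couplings; the notation $S_i$, $J_{ij}$, $\xi_i^{\mu}$, $N$, and $p$ is assumed here and not taken from the abstract itself:

\[
S_i(t+1) = \operatorname{sgn}\!\Big(\sum_{j=1}^{N} J_{ij}\, S_j(t)\Big),
\qquad
J_{ij} = \frac{1}{N}\sum_{\mu=1}^{p} \xi_i^{\mu}\,\xi_j^{\mu},
\]

where all $N$ spins are updated in parallel at each time step, in contrast to the sequential (asynchronous) update of Hopfield's model. The infinite-pattern regime then corresponds to a storage ratio $\alpha = p/N$ that stays finite as $N \to \infty$.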