Abstract
Among the various models proposed so far to account for the properties of neural networks, the one devised by Little and the one derived by Hopfield prove to be the most interesting, because they allow the use of statistical mechanics techniques. The link between the Hopfield model and statistical mechanics is provided by the existence of an extensive quantity. When the synaptic plasticity follows a Hebbian procedure, the analogy with the classical spin glass models studied by Van Hemmen is complete. In particular, exact solutions describing the steady states of noisy systems are found. The Little model, on the other hand, introduces a Markovian dynamics. It is shown that the evolution equation obeys the principle of microreversibility if the synaptic efficacies are symmetric. Therefore, assuming that such a symmetry holds, the Little model must obey Gibbs statistics, and the corresponding Hamiltonian is derived accordingly. Finally, using these results, both models are shown to display associative memory properties. In particular, the storage capacity of neural networks governed by the Little dynamics is similar to that of Hopfield networks. The conclusions drawn from the study of the Hopfield model can thus be extended to the Little model, which is certainly a more realistic description of the biological situation.
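For orientation, here is a minimal sketch of the quantities the abstract refers to, written in the conventional notation of the field rather than in notation taken from the paper itself: $N$ two-state neurons $S_i = \pm 1$, $p$ stored patterns $\xi^{\mu}$, and inverse noise level $\beta$. The Hebbian synaptic efficacies and the extensive quantity (the energy) of the Hopfield model are

\[
J_{ij} = \frac{1}{N}\sum_{\mu=1}^{p}\xi_i^{\mu}\xi_j^{\mu} = J_{ji},
\qquad
H_{\mathrm{Hopfield}} = -\frac{1}{2}\sum_{i\neq j} J_{ij}\, S_i S_j .
\]

The Little model updates all neurons in parallel, a Markov chain with single-step transition probability

\[
W(\mathbf{S}'\mid\mathbf{S}) = \prod_{i=1}^{N}\frac{\exp\!\big(\beta S_i' h_i(\mathbf{S})\big)}{2\cosh\!\big(\beta h_i(\mathbf{S})\big)},
\qquad
h_i(\mathbf{S}) = \sum_{j} J_{ij} S_j .
\]

When $J_{ij} = J_{ji}$, the product $p(\mathbf{S})\,W(\mathbf{S}'\mid\mathbf{S})$ is symmetric under $\mathbf{S}\leftrightarrow\mathbf{S}'$ (microreversibility), so the stationary distribution is a Gibbs distribution $p(\mathbf{S}) \propto e^{-\beta H_{\mathrm{Little}}(\mathbf{S})}$ with effective Hamiltonian

\[
H_{\mathrm{Little}}(\mathbf{S}) = -\frac{1}{\beta}\sum_{i=1}^{N}\ln\Big[2\cosh\!\big(\beta h_i(\mathbf{S})\big)\Big].
\]

This is the form commonly derived for the Little model under symmetric couplings; whether it matches the paper's exact conventions is an assumption here.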
