Abstract
An upper bound on the density of metastable states in neural networks of analog neurons with graded response is calculated exactly in the limit of an infinite number of neurons. The density of metastable states uncorrelated with any of the memory patterns is found to be markedly suppressed compared with that of deterministic networks of formal neurons. This result gives a theoretical explanation for the belief that networks of nonlinear analog neurons are highly effective in computation. The critical storage capacity is obtained as a function of the analog gain, which should correspond to the "temperature" in stochastic Ising-spin neural networks, and exhibits a small increase over that of the stochastic networks.
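For orientation, a standard graded-response model of the kind referred to here is the deterministic analog network (a sketch under assumed conventions; the abstract itself does not state the equations, so the Hebbian coupling and $\tanh$ transfer function below are assumptions)

$$\tau \frac{du_i}{dt} = -u_i + \sum_{j \neq i} J_{ij} V_j, \qquad V_j = \tanh(\beta u_j), \qquad J_{ij} = \frac{1}{N} \sum_{\mu=1}^{p} \xi_i^{\mu} \xi_j^{\mu},$$

where $V_j$ is the graded output of neuron $j$, $\xi^{\mu}$ ($\mu = 1, \dots, p$) are the stored memory patterns, and $\beta$ is the analog gain. In the limit $\beta \to \infty$ the transfer function $\tanh(\beta u)$ approaches $\mathrm{sgn}(u)$ and the formal two-state neuron is recovered; this is why the gain plays a role analogous to the temperature parameter of the stochastic Ising-spin network, with high gain corresponding to low temperature.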
