Abstract
Based on the self-consistent signal-to-noise analysis (SCSNA), which can treat analog neural networks with a wide class of transfer functions, the enhancement of the storage capacity of associative memory and the related statistical properties of neural networks are studied for random memory patterns. Two types of transfer functions with a threshold parameter θ are considered; both are derived from the sigmoidal one and represent the output of three-state neurons. Neural networks with the monotonically increasing transfer function F_M, defined by F_M(u) = sgn(u) for |u| > θ and F_M(u) = 0 for |u| ≤ θ, are shown to prevent the spin-glass state from coexisting with retrieval states in a certain region of the parameters θ and α (the loading rate of memory patterns), implying a reduction in the number of spurious states. The behavior of the storage capacity as θ is varied is qualitatively the same as that of Ising spin neural networks with varying temperature. In contrast, the nonmonotonic transfer function F_NM, defined by F_NM(u) = sgn(u) for |u| < θ and F_NM(u) = 0 for |u| ≥ θ, gives rise to remarkable features in several respects.
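As a minimal sketch of the two three-state transfer functions defined above (the names F_M, F_NM, and the threshold parameter theta follow the text; the scalar implementation is an illustrative assumption, not the paper's code):

```python
def sgn(u):
    # Sign function: returns +1, 0, or -1.
    return (u > 0) - (u < 0)

def F_M(u, theta):
    # Monotonic three-state transfer function from the abstract:
    # outputs sgn(u) when |u| > theta, and 0 in the dead zone |u| <= theta.
    return sgn(u) if abs(u) > theta else 0

def F_NM(u, theta):
    # Nonmonotonic counterpart: outputs sgn(u) only inside the
    # window |u| < theta, and 0 for |u| >= theta.
    return sgn(u) if abs(u) < theta else 0

# Example with theta = 0.5: the two functions respond to the same
# local field u in complementary regions of |u|.
print(F_M(1.0, 0.5), F_M(0.2, 0.5))    # -> 1 0
print(F_NM(0.2, 0.5), F_NM(1.0, 0.5))  # -> 1 0
```

Note how F_NM inverts the role of the threshold: large local fields are mapped to the zero state, which is what makes the function nonmonotonic in u.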