Abstract
Networks of ternary neurons storing random vectors over the set {-1, 0, 1} by the so-called Hebbian rule are considered. It is shown that the maximal number of stored patterns that are equilibrium states of the network, with probability tending to one as N tends to infinity, is at least on the order of N^(2-1/α)/K, where N is the number of neurons, K is the number of nonzero elements in a pattern, and t = αK, 1/2 < α < 1, is the threshold in the neuron function. While, for small K, this bound is similar to that obtained for fully connected binary networks, the number of interneural connections required in the ternary case is considerably smaller. Similar bounds, incorporating error probabilities, are shown to guarantee, in the same probabilistic sense, the correction of errors in the nonzero elements and in the location of these elements.
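The storage scheme described above can be illustrated with a minimal sketch. The function names (`hebbian_store`, `ternary_update`) are ours, and the update dynamics (a sum-of-outer-products weight matrix with zero diagonal, and a neuron function that outputs the sign of the local field when its magnitude exceeds the threshold t = αK, and 0 otherwise) are a standard Hopfield-style reading of the abstract, not code from the paper. The demonstration uses patterns with disjoint supports so that the fixed-point property holds deterministically rather than only with high probability:

```python
import numpy as np

def hebbian_store(patterns):
    """Hebbian weight matrix: sum of outer products, zero diagonal."""
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def ternary_update(W, state, t):
    """Ternary neuron function: sign of the local field if |field| > t, else 0."""
    h = W @ state
    return np.where(np.abs(h) > t, np.sign(h), 0).astype(int)

# Toy example: N = 60 neurons, K = 5 nonzero elements per pattern,
# threshold t = alpha * K with 1/2 < alpha < 1 as in the abstract.
N, K, alpha = 60, 5, 0.75
t = alpha * K

# Three patterns with disjoint supports (nonzero entries +1/-1 alternating).
patterns = []
for mu in range(3):
    p = np.zeros(N, dtype=int)
    p[mu * K:(mu + 1) * K] = [(-1) ** j for j in range(K)]
    patterns.append(p)

W = hebbian_store(patterns)

# Each stored pattern is an equilibrium state: for a nonzero neuron the
# local field is (K - 1) * sign(p_i), and K - 1 = 4 > t = 3.75.
for p in patterns:
    assert np.array_equal(ternary_update(W, p, t), p)
```

With disjoint supports there is no crosstalk between patterns, so every stored pattern is exactly a fixed point; the paper's result concerns the harder random-pattern case, where this holds only with probability tending to one as N grows.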
