On the capacity of ternary Hebbian networks
- 1 May 1991
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 37 (3), 528-534
- https://doi.org/10.1109/18.79908
Abstract
Networks of ternary neurons storing random vectors over the set {-1, 0, 1} by the so-called Hebbian rule are considered. It is shown that the maximal number of stored patterns that are equilibrium states of the network, with probability tending to one as N tends to infinity, is at least on the order of N^(2-1/α)/K, where N is the number of neurons, K is the number of nonzero elements in a pattern, and t = αK, 1/2 < α < 1, is the threshold in the neuron function. While, for small K, this bound is similar to that obtained for fully connected binary networks, the number of interneural connections required in the ternary case is considerably smaller. Similar bounds, incorporating error probabilities, are shown to guarantee, in the same probabilistic sense, the correction of errors in the nonzero elements and in the location of these elements.
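The abstract gives no pseudocode, but the setup it describes can be illustrated with a minimal NumPy sketch: sparse ternary patterns with K nonzero entries are stored via the outer-product Hebbian rule, and each neuron outputs ±1 only when its input field exceeds the threshold t = αK in magnitude. The parameter values N, K, M and α below are arbitrary illustrative choices, not taken from the paper, and the strict-inequality threshold rule is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, M = 500, 20, 3       # neurons, nonzeros per pattern, stored patterns
alpha = 0.6                # threshold parameter, 1/2 < alpha < 1
t = alpha * K              # neuron threshold t = alpha*K

# Random ternary patterns: K entries drawn from {-1, +1}, the rest zero.
patterns = np.zeros((M, N))
for mu in range(M):
    support = rng.choice(N, size=K, replace=False)
    patterns[mu, support] = rng.choice([-1.0, 1.0], size=K)

# Hebbian (outer-product) weights with zero diagonal.
W = patterns.T @ patterns
np.fill_diagonal(W, 0.0)

def update(x):
    """One synchronous step of the assumed ternary threshold dynamics."""
    h = W @ x
    return np.where(h > t, 1.0, np.where(h < -t, -1.0, 0.0))

# With M far below the capacity bound, each stored pattern is an
# equilibrium state: the field at a nonzero component is about +/-(K-1),
# which exceeds t = alpha*K, while at zero components it stays below t.
for xi in patterns:
    assert np.array_equal(update(xi), xi)

# Error correction in the spirit of the abstract's last claim: delete one
# nonzero element of a stored pattern; a single update restores it.
x = patterns[0].copy()
x[np.flatnonzero(x)[0]] = 0.0
assert np.array_equal(update(x), patterns[0])
```

Because the patterns are sparse (K nonzeros out of N), the crosstalk between stored patterns is small relative to the signal of roughly K at a pattern's own nonzero positions, which is why a threshold of αK with 1/2 < α < 1 separates the two cases.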