Memorizing binary vector sequences by a sparsely encoded network
- 1 January 1994
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 5 (6), 974-981
- https://doi.org/10.1109/72.329695
Abstract
We present a neural network employing Hebbian storage and sparse internal coding, capable of memorizing and correcting sequences of binary vectors by association. A ternary version of the Kanerva memory, folded into a feedback configuration, is shown to perform the basic sequence memorization and regeneration function. The inclusion of lateral connections between the internal cells increases the network capacity considerably and facilitates the correction of individual input patterns and the detection of large errors. The introduction of higher delays in the transmission lines between the external input-output layer and the internal memory layer is shown to further improve the network's error correction capability.
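The basic mechanism described above — a Kanerva-style sparse memory with Hebbian storage, folded into a feedback loop so that each retrieved pattern cues the next — can be illustrated with a minimal sketch. This is not the paper's implementation (the paper's ternary coding, lateral connections, and delay lines are omitted, and all sizes `N`, `M`, `K` are illustrative assumptions); it only shows the core store/regenerate cycle:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64    # length of bipolar (+/-1) pattern vectors (assumed size)
M = 2000  # number of internal memory cells (assumed size)
K = 50    # active cells per pattern, i.e. sparsity of the internal code

# Fixed random address vectors for the internal cells (Kanerva-style layer).
A = rng.choice([-1, 1], size=(M, N))

def sparse_code(x):
    """Sparse {0,1} internal code: activate the K cells whose address
    vector is most similar to the input pattern x."""
    sims = A @ x
    z = np.zeros(M)
    z[np.argsort(sims)[-K:]] = 1.0
    return z

def store_sequence(W, seq):
    """Hebbian storage of the transitions x_t -> x_{t+1} as outer products
    between the internal code of x_t and the next pattern."""
    for x_t, x_next in zip(seq[:-1], seq[1:]):
        W += np.outer(sparse_code(x_t), x_next)
    return W

def recall_sequence(W, x0, steps):
    """Feedback regeneration: each retrieved pattern is fed back
    through the memory as the cue for the next step."""
    x = x0.copy()
    out = [x.copy()]
    for _ in range(steps):
        x = np.sign(W.T @ sparse_code(x))
        x[x == 0] = 1  # break zero ties deterministically
        out.append(x.copy())
    return out

# Demo: store a short random sequence, then regenerate it from its first vector.
seq = [rng.choice([-1, 1], size=N) for _ in range(5)]
W = store_sequence(np.zeros((M, N)), seq)
rec = recall_sequence(W, seq[0], steps=len(seq) - 1)
ok = all((r == s).all() for r, s in zip(rec, seq))
print(ok)
```

Because the internal codes of distinct random patterns overlap in only about `K**2 / M` cells on average, the correct next-pattern term dominates the crosstalk at recall, and the fed-back output stays on the stored trajectory.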