Adaptive bidirectional associative memories
- 1 December 1987
- journal article
- Published by Optica Publishing Group in Applied Optics
- Vol. 26 (23), 4947-4960
- https://doi.org/10.1364/ao.26.004947
Abstract
Bidirectionality, forward and backward information flow, is introduced in neural networks to produce two-way associative search for stored stimulus-response associations (A_i, B_i). Two fields of neurons, F_A and F_B, are connected by an n × p synaptic matrix M. Passing information through M gives one direction; passing information through its transpose M^T gives the other. Every matrix is bidirectionally stable for bivalent and for continuous neurons. Paired data (A_i, B_i) are encoded in M by summing bipolar correlation matrices. The bidirectional associative memory (BAM) behaves as a two-layer hierarchy of symmetrically connected neurons. When the neurons in F_A and F_B are activated, the network quickly evolves to a stable state of two-pattern reverberation, or pseudoadaptive resonance, for every connection topology M. The stable reverberation corresponds to a system energy local minimum. An adaptive BAM allows M to rapidly learn associations without supervision. Stable short-term memory reverberations across F_A and F_B gradually seep pattern information into the long-term memory connections M, allowing input associations (A_i, B_i) to dig their own energy wells in the network state space. The BAM correlation encoding scheme is extended to a general Hebbian learning law. Then every BAM adaptively resonates in the sense that all nodes and edges quickly equilibrate in a system energy local minimum. A sampling adaptive BAM results when many more training samples are presented than there are neurons in F_A and F_B, but presented for brief pulses of learning, not allowing learning to fully or nearly converge. Learning tends to improve with sample size. Sampling adaptive BAMs can learn some simple continuous mappings and can rapidly abstract bivalent associations from several noisy gray-scale samples.
This publication has 11 references indexed in Scilit:
- A massively parallel architecture for a self-organizing neural pattern recognition machine. Published by Elsevier, 2005
- Fuzzy entropy and conditioning. Information Sciences, 1986
- Absolute stability of global pattern formation and parallel memory storage by competitive neural networks. IEEE Transactions on Systems, Man, and Cybernetics, 1983
- Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences, 1982
- How does a brain build a cognitive code? Psychological Review, 1980
- Distinctive features, categorical perception, and probability learning: Some applications of a neural model. Psychological Review, 1977
- Adaptive pattern classification and universal recoding: I. Parallel development and coding of neural feature detectors. Biological Cybernetics, 1976
- Contour Enhancement, Short Term Memory, and Constancies in Reverberating Neural Networks. Studies in Applied Mathematics, 1973
- Correlation Matrix Memories. IEEE Transactions on Computers, 1972
- A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biology, 1943