Redistribution of Synaptic Efficacy Supports Stable Pattern Learning in Neural Networks
- 1 April 2002
- journal article
- Published by MIT Press in Neural Computation
- Vol. 14 (4), 873-888
- https://doi.org/10.1162/089976602317318992
Abstract
Markram and Tsodyks, by showing that the elevated synaptic efficacy observed with single-pulse long-term potentiation (LTP) measurements disappears with higher-frequency test pulses, have critically challenged the conventional assumption that LTP reflects a general gain increase. This observed change in frequency dependence during synaptic potentiation is called redistribution of synaptic efficacy (RSE). RSE is here seen as the local realization of a global design principle in a neural network for pattern coding. The underlying computational model posits an adaptive threshold rather than a multiplicative weight as the elementary unit of long-term memory. A distributed instar learning law allows thresholds to increase only monotonically, but adaptation has a bidirectional effect on the model postsynaptic potential. At each synapse, threshold increases implement pattern selectivity via a frequency-dependent signal component, while a complementary frequency-independent component nonspecifically strengthens the path. This synaptic balance produces changes in frequency dependence that are robustly similar to those observed by Markram and Tsodyks. The network design therefore suggests a functional purpose for RSE: by helping to bound total memory change, it supports a distributed coding scheme that is stable with fast as well as slow learning. Multiplicative weights have served as a cornerstone for models of physiological data and neural systems for decades. Although the model discussed here does not implement the detailed physiology of synaptic transmission, its new learning laws operate in a network architecture that suggests how recently discovered synaptic computations such as RSE may help produce new network capabilities such as learning that is fast, stable, and distributed.
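The threshold mechanism described in the abstract can be made concrete with a small numerical sketch. The Python snippet below is not the authors' implementation: the variable names (tau for the adaptive threshold, x for presynaptic signal strength standing in for test-pulse frequency), the specific functional forms, and the learning rate are assumptions chosen only to show how a monotonically increasing threshold can potentiate the low-frequency response while the high-frequency response stays bounded.

```python
# Illustrative sketch of a threshold-based synapse (assumed forms, not the
# paper's equations). x in [0, 1] stands in for presynaptic test-pulse
# frequency; tau in [0, 1] is the adaptive threshold, the unit of memory.

def postsynaptic_signal(x, tau):
    """Two-component model signal at one synapse.

    phasic: frequency-dependent, pattern-selective component that shrinks
            as the threshold rises.
    tonic:  frequency-independent component that nonspecifically
            strengthens the path as the threshold rises.
    """
    phasic = min(x, 1.0 - tau)   # bounded by the remaining dynamic range
    tonic = tau                  # grows only with learning
    return phasic + tonic

def learn(x, tau, rate=1.0):
    """Monotone update in the spirit of a distributed instar law:
    tau never decreases, moving toward the complement of the input."""
    return tau + rate * max(0.0, (1.0 - x) - tau)

tau = 0.0
x_low, x_high = 0.2, 0.9
print(postsynaptic_signal(x_low, tau), postsynaptic_signal(x_high, tau))  # 0.2 0.9
tau = learn(x_low, tau)                                                   # tau -> 0.8
print(postsynaptic_signal(x_low, tau), postsynaptic_signal(x_high, tau))  # 1.0 1.0
```

Under these assumptions, learning raises the low-frequency response fivefold while the high-frequency response barely changes, a qualitative analog of RSE: potentiation visible with single or low-frequency test pulses disappears at high frequency. Because tau only moves upward and the total signal is capped, the memory change at each synapse is bounded, which is the property the abstract links to stable fast learning.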
References (19 indexed; 10 shown)
- A massively parallel architecture for a self-organizing neural pattern recognition machine. Published by Elsevier, 2005
- Neural-network models of learning and memory: leading questions and an emerging framework. Trends in Cognitive Sciences, 2001
- Distributed Learning, Recognition, and Prediction by ART and ARTMAP Neural Networks. Neural Networks, 1997
- Synaptic Depression and Cortical Gain Control. Science, 1997
- A distributed outstar network for spatial pattern learning. Neural Networks, 1994
- Normal and amnesic learning, recognition and memory by a neural model of cortico-hippocampal interactions. Trends in Neurosciences, 1993
- Fuzzy ARTMAP: A neural network architecture for incremental supervised learning of analog multidimensional maps. IEEE Transactions on Neural Networks, 1992
- Fuzzy ART: Fast stable learning and categorization of analog patterns by an adaptive resonance system. Neural Networks, 1991
- ARTMAP: Supervised real-time learning and classification of nonstationary data by a self-organizing neural network. Neural Networks, 1991
- ART 3: Hierarchical search using chemical transmitters in self-organizing pattern recognition architectures. Neural Networks, 1990