Removing Time Variation with the Anti-Hebbian Differential Synapse
- 1 September 1991
- journal article
- Published by MIT Press in Neural Computation
- Vol. 3 (3), 312-320
- https://doi.org/10.1162/neco.1991.3.3.312
Abstract
I describe a local synaptic learning rule that can be used to remove the effects of certain types of systematic temporal variation in the inputs to a unit. According to this rule, changes in synaptic weight result from a conjunction of short-term temporal changes in the inputs and the output. Formally, δw_i ∝ -(dx_i/dt)(dy/dt), where x_i are the inputs and y is the output. This is like the differential rule proposed by Klopf (1986) and Kosko (1986), except for a change of sign, which gives it an anti-Hebbian character. By itself this rule is insufficient. A weight conservation condition is needed to prevent the weights from collapsing to zero, and some further constraint, implemented here by a biasing term, is needed to select particular sets of weights from the subspace of those which give minimal variation. As an example, I show that this rule will generate center-surround receptive fields that remove temporally varying linear gradients from the inputs.
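The rule described in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's own simulation: the 1-D input positions, noise level, learning rates, and the centered bias template are all assumptions chosen for the demo. Weight conservation is implemented here by fixing the sum of squared weights, one simple choice among several.

```python
import numpy as np

# Minimal sketch of the anti-Hebbian differential rule from the abstract.
# Everything below (1-D positions, noise level, learning rates, the
# centered bias template) is an illustrative assumption, not the paper's
# actual simulation.
rng = np.random.default_rng(0)
n = 5
p = np.linspace(-1.0, 1.0, n)        # hypothetical 1-D input positions
template = np.exp(-p**2 / 0.2)       # biasing term favoring a centered profile

w = rng.normal(0.0, 0.2, n)
w /= np.linalg.norm(w)               # weight conservation: fix the sum of squares

eta, eta_b, sigma = 0.01, 0.005, 0.3
x_prev = y_prev = None
for t in range(4000):
    g = rng.normal()                          # gradient strength varies in time
    x = g * p + sigma * rng.normal(size=n)    # linear gradient plus noise
    y = w @ x
    if x_prev is not None:
        dx, dy = x - x_prev, y - y_prev
        w -= eta * dx * dy                    # anti-Hebbian differential update
        w += eta_b * template                 # bias selects within the null space
        w /= np.linalg.norm(w)                # re-impose weight conservation
    x_prev, y_prev = x, y

print(abs(w @ p))   # near zero: the unit no longer responds to linear gradients
```

After training, the weight vector is (numerically) orthogonal to the position vector, so temporally varying linear gradients cancel in the output, while the bias term picks out a centered weight profile from the minimal-variation subspace.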
References
- Learning Invariance from Transformation Sequences. Neural Computation, 1991
- Connectionist learning procedures. Artificial Intelligence, 1989
- Unsupervised Learning. Neural Computation, 1989
- Synthesizing a Color Algorithm from Examples. Science, 1988
- Learning representations by back-propagating errors. Nature, 1986
- Hebbian synapses in hippocampus. Proceedings of the National Academy of Sciences, 1986
- Hippocampal long-term potentiation is induced by pairing single afferent volleys with intracellularly injected depolarizing current pulses. Acta Physiologica Scandinavica, 1986
- A drive-reinforcement model of single neuron function: An alternative to the Hebbian neuronal model. AIP Conference Proceedings, 1986
- Differential Hebbian learning. AIP Conference Proceedings, 1986