Slow stochastic Hebbian learning of classes of stimuli in a recurrent neural network
- 1 February 1998
- journal article
- Published by Taylor & Francis in Network: Computation in Neural Systems
- Vol. 9 (1), 123-152
- https://doi.org/10.1088/0954-898x/9/1/007
Abstract
We study unsupervised Hebbian learning in a recurrent network in which synapses have a finite number of stable states. Stimuli received by the network are drawn at random at each presentation from a set of classes. Each class is defined as a cluster in stimulus space, centred on the class prototype. The presentation protocol is chosen to mimic the protocols of visual memory experiments in which a set of stimuli is presented repeatedly in a random way. The statistics of the input stream may be stationary, or changing. Each stimulus induces, in a stochastic way, transitions between stable synaptic states. Learning dynamics is studied analytically in the slow learning limit, in which a given stimulus has to be presented many times before it is memorized, i.e. before synaptic modifications enable a pattern of activity correlated with the stimulus to become an attractor of the recurrent network. We show that in this limit the synaptic matrix becomes more correlated with the class prototypes than with any of the instances of the class. We also show that the number of classes that can be learned increases sharply when the coding level decreases, and determine the speeds of learning and forgetting of classes in the case of changes in the statistics of the input stream.
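The learning rule described in the abstract — binary stimuli stochastically flipping synapses between a finite set of stable states — can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the paper's exact model: it uses two synaptic states {0, 1}, a single transition probability per direction (`q_plus`, `q_minus`), and noisy instances generated by flipping a fraction of the prototype's bits; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100          # number of neurons
f = 0.1          # coding level: fraction of active neurons per stimulus
q_plus = 0.01    # potentiation probability (slow-learning limit: q << 1)
q_minus = 0.01   # depression probability

# Binary synaptic matrix: each synapse sits in one of two stable states {0, 1}
J = rng.integers(0, 2, size=(N, N))

def present(J, xi):
    """One stochastic Hebbian update for a binary stimulus xi (shape (N,))."""
    pre = xi[np.newaxis, :].astype(bool)
    post = xi[:, np.newaxis].astype(bool)
    both_active = post & pre               # pre and post active -> candidate LTP
    one_active = post ^ pre                # exactly one active  -> candidate LTD
    ltp = both_active & (rng.random(J.shape) < q_plus)
    ltd = one_active & (rng.random(J.shape) < q_minus)
    J[ltp] = 1
    J[ltd] = 0
    return J

# A class prototype and noisy instances of it; repeated presentation slowly
# drives J toward the correlation structure of the prototype
proto = (rng.random(N) < f).astype(int)
for _ in range(1000):
    noisy = np.where(rng.random(N) < 0.05, 1 - proto, proto)  # flip 5% of bits
    J = present(J, noisy)
```

After many presentations, synapses between neurons that are active in the prototype are potentiated far more often than depressed, so the synaptic matrix ends up correlated with the prototype rather than with any single noisy instance — the qualitative effect the paper analyses.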
This publication has 25 references indexed in Scilit:
- Long-term depression of excitatory synaptic transmission and its relationship to long-term potentiation. Published by Elsevier, 2003
- Model of global spontaneous activity and local structured activity during delay periods in the cerebral cortex. Cerebral Cortex, 1997
- Learning internal representations in an attractor neural network with analogue neurons. Network: Computation in Neural Systems, 1995
- Electronic implementation of an analogue attractor neural network with stochastic learning. Network: Computation in Neural Systems, 1995
- Correlations of cortical Hebbian reverberations: theory versus experiment. Journal of Neuroscience, 1994
- Learning in Neural Networks with Material Synapses. Neural Computation, 1994
- Dynamics of an attractor neural network converting temporal into spatial correlations. Network: Computation in Neural Systems, 1994
- A synaptic model of memory: long-term potentiation in the hippocampus. Nature, 1993
- Constraints on learning in dynamic synapses. Network: Computation in Neural Systems, 1992
- Theory for the development of neuron selectivity: orientation specificity and binocular interaction in visual cortex. Journal of Neuroscience, 1982