Learning attractors in an asynchronous, stochastic electronic neural network
- 1 May 1998
- journal article
- Published by Taylor & Francis in Network: Computation in Neural Systems
- Vol. 9 (2), 183-205
- https://doi.org/10.1088/0954-898x/9/2/003
Abstract
LANN27 is an electronic device implementing in discrete electronics a fully connected (full feedback) network of 27 neurons and 351 plastic synapses with stochastic Hebbian learning. Both neurons and synapses are dynamic elements, with two time constants: fast for neurons and slow for synapses. Learning, i.e. the synaptic dynamics, is analogue and is driven in a Hebbian way by the neural activities. Long-term memorization takes place on a discrete set of synaptic efficacies and is effected in a stochastic manner. The intense feedback between the nonlinear neural elements, via the learned synaptic structure, creates in an organic way a set of attractors for the collective retrieval dynamics of the neural system, akin to Hebbian learned reverberations. The resulting structure of the attractors is a record of the large-scale statistics in the uncontrolled, incoming flow of stimuli. As the statistics of the stimulus flow change significantly, the attractors slowly follow, and the network behaves as a palimpsest: old is gradually replaced by new. Moreover, the slow learning creates attractors which render the network a prototype extractor: entire clouds of stimuli, noisy versions of a prototype used in training, all lead upon retrieval to the attractor corresponding to the prototype. Here we describe the process of studying the collective dynamics of the network before, during and following learning, which is rendered complex by the richness of the possible stimulus streams and the large dimensionality of the space of states of the network. We propose sampling techniques and modes of representation for the outcome.
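The abstract describes two mechanisms: stochastic Hebbian learning onto a discrete set of synaptic efficacies, and asynchronous retrieval dynamics that relax to attractors shaped by the statistics of the stimulus stream. The sketch below is a minimal software illustration of those two ideas, not the LANN27 hardware; the number of synaptic levels, the transition probability, the noise levels and the update rule are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of stochastic Hebbian learning on two discrete synaptic
# levels plus asynchronous attractor retrieval. Parameters (q, noise levels,
# two-level synapses) are assumptions for illustration, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
N = 27                       # neurons, as in LANN27
q = 0.05                     # per-presentation transition probability (assumed)
J = -np.ones((N, N))         # symmetric synapses on two levels {-1, +1} (assumed)
np.fill_diagonal(J, 0.0)

def learn(pattern, J):
    """Stochastic Hebbian update: each synapse moves to the level favoured by
    the pre/post activity product, but only with small probability q."""
    target = np.outer(pattern, pattern)      # Hebbian 'desired' sign per pair
    flip = rng.random((N, N)) < q
    flip = np.triu(flip, 1)                  # one decision per symmetric pair
    flip = flip | flip.T
    J = np.where(flip, np.sign(target), J)
    np.fill_diagonal(J, 0.0)
    return J

def retrieve(state, J, sweeps=20):
    """Asynchronous dynamics: update one randomly chosen neuron at a time."""
    state = state.copy()
    for _ in range(sweeps * N):
        i = rng.integers(N)
        state[i] = 1 if J[i] @ state > 0 else -1
    return state

# Train on noisy versions of one prototype; the slow stochastic learning
# should carve out an attractor near the prototype (prototype extraction).
prototype = rng.choice([-1, 1], size=N)
for _ in range(200):
    noisy = prototype * np.where(rng.random(N) < 0.1, -1, 1)   # 10% bits flipped
    J = learn(noisy, J)

probe = prototype * np.where(rng.random(N) < 0.2, -1, 1)       # degraded cue
recalled = retrieve(probe, J)
print("overlap with prototype:", (recalled @ prototype) / N)
```

Because each presentation changes only a small random fraction of synapses, continuing the same loop on a new prototype gradually overwrites the old attractor, which is the palimpsest behaviour the abstract refers to.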