Generalization of back-propagation to recurrent neural networks
- 9 November 1987
- journal article
- research article
- Published by American Physical Society (APS) in Physical Review Letters
- Vol. 59 (19), 2229-2232
- https://doi.org/10.1103/physrevlett.59.2229
Abstract
An adaptive neural network with asymmetric connections is introduced. This network is related to the Hopfield network with graded neurons and uses a recurrent generalization of the δ rule of Rumelhart, Hinton, and Williams to adaptively modify the synaptic weights. The new network bears a resemblance to the master/slave network of Lapedes and Farber, but it is architecturally simpler.
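The "recurrent generalization of the δ rule" mentioned in the abstract can be illustrated with a minimal NumPy sketch: relax the network dynamics to a fixed point, propagate the output error through the linearized dynamics at that fixed point, and update the weights with an outer-product rule. This is an illustrative reconstruction under stated assumptions, not the paper's exact formulation: the sigmoid activation, the function names, and the direct linear solve for the error fixed point (in place of a second relaxation) are choices made here.

```python
import numpy as np

def g(u):
    """Sigmoid activation for the graded neurons (an assumed choice)."""
    return 1.0 / (1.0 + np.exp(-u))

def relax(W, I, steps=300, dt=0.1):
    """Integrate dx/dt = -x + g(W x + I) until the activity settles."""
    x = np.zeros(len(I))
    for _ in range(steps):
        x += dt * (-x + g(W @ x + I))
    return x

def recurrent_delta_step(W, I, target, out_idx, lr=0.5):
    """One weight update: relax to the fixed point, solve the adjoint
    (error-propagation) equations there, and apply the outer-product rule.
    Returns the squared error at the fixed point before the update."""
    x = relax(W, I)
    u = W @ x + I
    gp = g(u) * (1.0 - g(u))           # sigmoid derivative g'(u)
    e = np.zeros_like(x)
    e[out_idx] = target - x[out_idx]   # error defined on output units only
    n = len(x)
    # Fixed point of the linearized error dynamics: z = (I - W^T D)^{-1} e,
    # with D = diag(g'(u)); solved directly instead of by a second relaxation.
    z = np.linalg.solve(np.eye(n) - W.T * gp, e)
    W += lr * np.outer(gp * z, x)      # ΔW_ij ∝ g'(u_i) z_i x*_j
    return 0.5 * np.sum(e ** 2)

# Tiny demo: drive the last unit of a 4-neuron network toward 0.8.
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((4, 4))
I = 0.1 * rng.standard_normal(4)
errors = [recurrent_delta_step(W, I, target=0.8, out_idx=3) for _ in range(50)]
```

The weights are kept small so the relaxation is a contraction and a stable fixed point exists; repeated update steps then reduce the output error by gradient descent.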
This publication has 3 references indexed in Scilit:
- Parallel Distributed Processing. Published by MIT Press, 1986
- Neurons with graded response have collective computational properties like those of two-state neurons. Proceedings of the National Academy of Sciences, 1984
- Characteristics of Random Nets of Analog Neuron-Like Elements. IEEE Transactions on Systems, Man, and Cybernetics, 1972