Learning algorithms and fixed dynamics
- 9 December 2002
- Conference paper
- Published by the Institute of Electrical and Electronics Engineers (IEEE)
- Vol. I, pp. 799-801
- https://doi.org/10.1109/ijcnn.1991.155280
Abstract
The authors discuss the equivalence of learning algorithms and nonlinear dynamic systems whose differential equations have fixed coefficients. They show how backpropagation transforms into a fixed-weight recursive neural network suitable for VLSI or optical implementations. The transformation is quite general and implies that understanding physiological networks may require one to determine the values of fixed parameters distributed throughout a network. Equivalently, a particular synaptic weight update mechanism such as Hebbian learning could likely be used to implement many known learning algorithms. The authors use the transformation process to illustrate why a network whose only variable weights are hidden-layer thresholds is capable of universal approximation.
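As a concrete illustration of the abstract's central idea, here is a minimal sketch, not taken from the paper, of how an online learning rule can be read as the fixed dynamics of a larger system: the delta rule for a single linear unit is just a state update with fixed coefficients (the learning rate), driven by the input and teacher signals. All names (`run`, `eta`, `w_true`) are illustrative.

```python
import numpy as np

# Hypothetical sketch (not the authors' construction): the delta rule for a
# single linear unit y = w * x can be read two ways.
#
# View 1 (learning algorithm): w is a plastic weight, updated by
#     w <- w + eta * (t - w * x) * x.
# View 2 (fixed dynamics): w is simply a state variable of a larger system
# whose update equation has fixed coefficients (eta); the input x and the
# teacher signal t drive that state, and no coefficient ever changes.

eta = 0.1                          # fixed coefficient of the dynamics
rng = np.random.default_rng(0)

def run(steps=200, w0=0.0, w_true=2.0):
    w = w0                         # state variable, not a "weight" in view 2
    for _ in range(steps):
        x = rng.uniform(-1.0, 1.0)
        t = w_true * x             # teacher signal from the target map
        y = w * x                  # forward pass of the unit
        w = w + eta * (t - y) * x  # fixed-coefficient state update
    return w

print(run())                       # state converges toward w_true = 2.0
```

Scaled up, the same reading turns a learning algorithm's weight updates into extra state variables of a network whose connections never change, which is the kind of transformation the paper applies to backpropagation.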