Learning algorithms and fixed dynamics

Abstract
The authors discuss the equivalence of learning algorithms and nonlinear dynamic systems whose differential equations have fixed coefficients. They show how backpropagation transforms into a fixed-weight recursive neural network suitable for VLSI or optical implementations. The transformation is quite general and implies that understanding physiological networks may require determining the values of fixed parameters distributed throughout a network. Equivalently, a particular synaptic weight-update mechanism, such as Hebbian learning, could in principle be used to implement many known learning algorithms. The authors use the transformation process to illustrate why a network whose only variable weights are hidden-layer thresholds is capable of universal approximation.
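
To make the central idea concrete, the sketch below shows a much simpler learner than the authors' backpropagation construction: the LMS (delta) rule rewritten as a dynamical system whose update equation has strictly fixed coefficients. The weights w are reinterpreted as state variables of a larger fixed-weight network, the training pairs (x, d) arrive as ordinary inputs, and the only coefficient in the update rule, the step size eta, never changes. The names (fixed_dynamics_step, w_true, eta) are illustrative choices, not the paper's notation.

```python
import numpy as np

# A minimal sketch (not the authors' construction): the LMS/delta rule
# viewed as a fixed-coefficient dynamical system. The "weights" w are
# state variables of the larger network; the update rule below has no
# trainable parameters -- eta is a fixed coefficient, and (x, d) are
# ordinary inputs.

rng = np.random.default_rng(0)

def fixed_dynamics_step(w, x, d, eta=0.1):
    """One step of the fixed dynamics: state w, inputs (x, d).
    The multiplicative interactions (w @ x and e * x) play the role of
    the fixed 'learning' connections in the transformed network."""
    y = w @ x                 # forward pass of the embedded learner
    e = d - y                 # error signal, computed by fixed circuitry
    return w + eta * e * x    # state update with fixed coefficient eta

# Target weights the embedded learner should discover.
w_true = np.array([2.0, -1.0, 0.5])

# Run the fixed dynamics on a stream of (x, d) pairs. No coefficient of
# the update equation ever changes; only the state w evolves.
w = np.zeros(3)
for _ in range(500):
    x = rng.normal(size=3)
    d = w_true @ x
    w = fixed_dynamics_step(w, x, d)

print(np.round(w, 3))  # state converges near w_true: [ 2. -1.  0.5]
```

Under this reading, "learning" is nothing more than the trajectory of a higher-dimensional network with fixed dynamics, which is what makes fixed-weight VLSI or optical realizations plausible.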