Recurrent neural network training with feedforward complexity
- 1 March 1994
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 5 (2), 185-197
- https://doi.org/10.1109/72.279184
Abstract
This paper presents a training method of no more than feedforward complexity for fully recurrent networks. The method is not approximate; it depends on an exact transformation that reveals an embedded feedforward structure in every recurrent network. It turns out that given any unambiguous training data set, such as samples of the state variables and their derivatives, we need only train this embedded feedforward structure. The necessary recurrent network parameters are then obtained by an inverse transformation consisting only of linear operators. As an example of modeling a representative nonlinear dynamical system, the method is applied to learn Bessel's differential equation, thereby generating Bessel functions within, as well as outside, the training set.
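For intuition, here is a minimal sketch of the idea in Python. It assumes a simplified continuous-time model dx/dt = -x + W tanh(x), a hypothetical stand-in for the paper's actual architecture, with Bessel's equation of order zero as the target dynamics. Once the nonlinearity is fixed, recovering the recurrent weight matrix W from state/derivative samples reduces to a linear least-squares problem, loosely mirroring the "inverse transformation that consists only of linear operators" described in the abstract; this is an illustration of the concept, not the authors' exact procedure.

```python
# Sketch (assumed model, not the paper's exact architecture):
#     dx/dt = -x + W tanh(x)
# Given samples of the states x(t) and derivatives dx/dt, the embedded
# feedforward target W tanh(x) = dx/dt + x is known, so W follows from a
# purely linear operation (least squares) rather than training through time.
import numpy as np
from scipy.special import jv  # Bessel functions of the first kind

n = 0                                      # order of Bessel's equation
t = np.linspace(0.5, 10.0, 200)            # avoid the singularity at t = 0
x1 = jv(n, t)                              # y  = J_n(t)
x2 = 0.5 * (jv(n - 1, t) - jv(n + 1, t))   # y' via the standard recurrence
# Bessel's equation:  y'' = -y'/t - (1 - n**2 / t**2) * y
x2dot = -x2 / t - (1.0 - n**2 / t**2) * x1

X = np.stack([x1, x2], axis=1)             # state samples, shape (200, 2)
Xdot = np.stack([x2, x2dot], axis=1)       # derivative samples

# "Train" the embedded feedforward structure: with tanh fixed, recovering
# the recurrent weights is a linear least-squares solve.
H = np.tanh(X)                             # hidden activations
T = Xdot + X                               # feedforward targets
W, *_ = np.linalg.lstsq(H, T, rcond=None)  # rows act on tanh(x)

print("recovered W:\n", W.T)               # conventional (output x input) form
print("fit residual:", np.linalg.norm(H @ W - T))
# Note: this two-neuron toy model cannot represent the time-varying
# coefficients of Bessel's equation exactly, so the residual is nonzero;
# the point is only that W is obtained by linear algebra, not by gradient
# descent through the recurrent dynamics.
```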