Comparative analysis of backpropagation and the extended Kalman filter for training multilayer perceptrons
- 1 June 1992
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- Vol. 14 (6), 686-691
- https://doi.org/10.1109/34.141559
Abstract
The relationship between backpropagation and extended Kalman filtering for training multilayer perceptrons is examined. These two techniques are compared theoretically and empirically using sensor imagery. Backpropagation is a technique from neural networks for assigning weights in a multilayer perceptron. An extended Kalman filter can also be used for this purpose. A brief review of the multilayer perceptron and these two training methods is provided. Then, it is shown that backpropagation is a degenerate form of the extended Kalman filter. The training rules are compared in two examples: an image classification problem using laser radar Doppler imagery and a target detection problem using absolute range images. In both examples, the backpropagation training algorithm is shown to be three orders of magnitude less costly than the extended Kalman filter algorithm in terms of the number of floating-point operations.
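The contrast the abstract draws can be made concrete on the simplest possible network, a single logistic neuron. The sketch below (not taken from the paper; all data, hyperparameters, and variable names are illustrative) trains the same weights two ways: by backpropagation, i.e. a gradient step with a fixed scalar gain, and by an extended Kalman filter, which replaces that fixed gain with a Kalman gain computed from a propagated weight covariance. Backpropagation appears as the degenerate case in which the covariance is never updated and the gain is a constant.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
# Toy linearly separable 2-class data for a single logistic neuron
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# --- Backpropagation: gradient step with a fixed scalar gain eta ---
w_bp = np.zeros(2)
eta = 0.5
for x_i, y_i in zip(X, y):
    h = sigmoid(w_bp @ x_i)
    w_bp += eta * (y_i - h) * h * (1 - h) * x_i

# --- Extended Kalman filter over the same weight vector ---
w_ekf = np.zeros(2)
P = np.eye(2) * 10.0   # weight-error covariance (initial uncertainty)
R = 1.0                # assumed measurement-noise variance
for x_i, y_i in zip(X, y):
    h = sigmoid(w_ekf @ x_i)
    H = (h * (1 - h) * x_i)[None, :]   # 1x2 Jacobian of the output w.r.t. weights
    S = H @ P @ H.T + R                # innovation variance
    K = P @ H.T / S                    # Kalman gain (replaces the fixed eta)
    w_ekf += (K * (y_i - h)).ravel()
    P = P - K @ H @ P                  # covariance update backprop omits

acc_bp = np.mean((sigmoid(X @ w_bp) > 0.5) == y)
acc_ekf = np.mean((sigmoid(X @ w_ekf) > 0.5) == y)
print(acc_bp, acc_ekf)
```

The extra cost the abstract reports comes from the covariance recursion: each EKF step updates the full `P` matrix, which for a network with n weights is an O(n^2) operation per training sample, whereas the backpropagation step above touches only the n weights themselves.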