Optimal convergence of on-line backpropagation
- 1 January 1996
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 7 (1) , 251-254
- https://doi.org/10.1109/72.478415
Abstract
Many researchers are quite skeptical about the actual behavior of neural network learning algorithms like backpropagation. One of the major problems is the lack of clear theoretical results on optimal convergence, particularly for pattern mode algorithms. In this paper, we prove the companion of Rosenblatt's PC (perceptron convergence) theorem (1960) for feedforward networks, stating that pattern mode backpropagation converges to an optimal solution for linearly separable patterns.
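The sketch below is an illustrative (not the paper's) demonstration of the setting the abstract describes: pattern mode ("on-line") gradient descent, where weights are updated after every individual pattern rather than once per epoch, applied to a linearly separable toy problem. For brevity a single sigmoid unit stands in for a feedforward network; the dataset, learning rate, and all names are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Linearly separable 2-D patterns: class 1 if x0 + x1 > 0, else class 0.
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)   # weights
b = 0.0           # bias
eta = 0.5         # learning rate (assumed value)

for epoch in range(100):
    # Pattern mode: visit patterns one at a time and update immediately
    # after each one, instead of accumulating gradients over the epoch.
    for i in rng.permutation(len(X)):
        out = sigmoid(X[i] @ w + b)
        err = y[i] - out
        grad = err * out * (1.0 - out)   # d(squared error)/d(net input)
        w += eta * grad * X[i]
        b += eta * grad

# Check whether the learned unit separates the training patterns.
pred = (sigmoid(X @ w + b) > 0.5).astype(float)
print("training accuracy:", (pred == y).mean())
```

On this separable toy set the per-pattern updates typically reach a separating solution; the paper's contribution is the theoretical guarantee of such convergence for pattern mode backpropagation, analogous to the perceptron convergence theorem.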