Can backpropagation error surface not have local minima
- 1 January 1992
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 3 (6), 1019-1021
- https://doi.org/10.1109/72.165604
Abstract
It is shown theoretically that for an arbitrary T-element training set with t (t ≤ T) different inputs, the backpropagation error surface does not have suboptimal local minima if the network is capable of exactly implementing an arbitrary training set consisting of t different patterns. As a special case, the error surface of a backpropagation network with one hidden layer and t−1 hidden units has no local minima if the network is trained on an arbitrary T-element set with t different inputs.
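A sketch of the claim in formal terms may help; the notation below (network map N, weight vector w, squared-error cost E) is assumed for illustration and is not taken from the paper itself:

```latex
% Assumed notation: N(w, x) is the network output under weights w;
% the training set is {(x_i, d_i)}_{i=1}^{T}, containing exactly
% t (t <= T) distinct inputs among the x_i.
\[
E(w) \;=\; \sum_{i=1}^{T} \bigl\| N(w, x_i) - d_i \bigr\|^{2}
\]
% Hypothesis: for every set {(u_j, y_j)}_{j=1}^{t} with distinct u_j,
% some w* satisfies N(w*, u_j) = y_j for all j (exact implementation
% of an arbitrary t-pattern training set).
% Conclusion: every local minimum of E is a global minimum, i.e. the
% error surface has no suboptimal local minima.
```

Note that the global minimum of E need not be zero when T > t: a repeated input can carry conflicting targets, so the result rules out suboptimal local minima rather than guaranteeing zero training error.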