Comments on "Can backpropagation error surface not have local minima?".
- 1 September 1994
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 5 (5), 844-845
- https://doi.org/10.1109/72.317738
Abstract
In the above paper, Yu (IEEE Trans. Neural Networks, vol. 3, no. 6, pp. 1019-21, 1992) claims to prove that local minima do not exist in the error surface of backpropagation networks trained on data with t distinct input patterns, provided the network is capable of exactly representing arbitrary mappings on t input patterns. The commenter points out that the presented proof is flawed, so the resulting claims remain unproved. In reply, Yu notes that the undesired phenomenon that was cited can be avoided by simply imposing the arbitrary-mapping capacity of the network on Lemma 1 of the article.
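The dispute above concerns whether a gradient-trained network can get trapped away from a global optimum. As a minimal sketch of that general phenomenon (illustrative only; the loss function below is an assumption, not Yu's construction or the commenter's counterexample), gradient descent on a non-convex one-parameter "error surface" can converge to a local, non-global minimum depending on the starting point:

```python
# Toy illustration: a one-parameter non-convex loss with two minima.
# The specific function is a hypothetical choice for demonstration.

def loss(w):
    # Two minima: near w = -1.04 (global) and w = 0.96 (local).
    return (w**2 - 1)**2 + 0.3 * w

def grad(w):
    # Analytic derivative of the loss above.
    return 4 * w * (w**2 - 1) + 0.3

w = 2.0                      # start in the basin of the local minimum
for _ in range(500):
    w -= 0.01 * grad(w)      # plain gradient descent

print(round(w, 2))           # settles near 0.96, the local (worse) minimum
print(loss(w) > loss(-1.04)) # True: the global minimum was missed
```

Started from w = -2.0 instead, the same update rule reaches the global minimum, which is exactly the initialization-dependence that makes the existence or absence of local minima worth proving.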