Enhanced training algorithms, and integrated training/architecture selection for multilayer perceptron networks
- 1 January 1992
- Journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 3 (6), 864-875
- https://doi.org/10.1109/72.165589
Abstract
The standard backpropagation-based multilayer perceptron training algorithm suffers from a slow asymptotic convergence rate. Sophisticated nonlinear least-squares and quasi-Newton optimization techniques are used to construct enhanced multilayer perceptron training algorithms, which are then compared to the backpropagation algorithm in the context of several example problems. In addition, an integrated approach to training and architecture selection that uses the described enhanced algorithms is presented, and its effectiveness is illustrated in the context of synthetic and actual pattern recognition problems.
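The paper itself contains no code; the following is a minimal sketch of the kind of comparison the abstract describes. It fits a one-hidden-layer tanh network to a toy XOR problem, with SciPy's BFGS routine standing in for a quasi-Newton training method and a plain gradient-descent loop standing in for vanilla backpropagation. The network size, dataset, learning rate, and all names here are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch (not the authors' implementation) of quasi-Newton MLP
# training versus plain gradient descent on a toy problem.
import numpy as np
from scipy.optimize import minimize, approx_fprime

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # toy inputs
y = np.array([0., 1., 1., 0.])                          # XOR targets
H = 3                                                   # hidden units (illustrative)

def unpack(w):
    """Split the flat parameter vector into layer weights and biases."""
    W1 = w[:2 * H].reshape(2, H)
    b1 = w[2 * H:3 * H]
    W2 = w[3 * H:4 * H]
    b2 = w[4 * H]
    return W1, b1, W2, b2

def forward(w, inputs):
    """One hidden tanh layer followed by a linear output."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(inputs @ W1 + b1)
    return h @ W2 + b2

def sse(w):
    """Sum-of-squared-errors loss, the natural least-squares objective."""
    r = forward(w, X) - y
    return 0.5 * np.dot(r, r)

n_params = 4 * H + 1
w0 = 0.5 * rng.standard_normal(n_params)

# Quasi-Newton (BFGS) training. Gradients are approximated numerically
# here for brevity; a real implementation would backpropagate them.
res = minimize(sse, w0, method="BFGS")
print("BFGS loss:", res.fun, "predictions:", np.round(forward(res.x, X), 3))

# Plain fixed-step gradient descent, standing in for vanilla
# backpropagation, for contrast on the same starting point.
w = w0.copy()
for _ in range(2000):
    w -= 0.1 * approx_fprime(w, sse, 1e-6)
print("GD loss:", sse(w))
```

For the nonlinear least-squares variants the abstract mentions, an analogous off-the-shelf route would be a Gauss-Newton/Levenberg-Marquardt solver such as `scipy.optimize.least_squares` applied to the residual vector rather than the scalar loss; the paper's actual algorithms and its architecture-selection criterion should be consulted for the details.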