Multi-layer perceptron ensembles for increased performance and fault-tolerance in pattern recognition tasks
- 17 December 2002
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- Vol. 5, 2901-2906
- https://doi.org/10.1109/icnn.1994.374692
Abstract
Multilayer perceptrons (MLPs) have proven to be an effective way to solve classification tasks. A major concern in their use is the difficulty of defining the proper network for a specific application, owing to sensitivity to initial conditions and to the overfitting and underfitting problems that limit their generalization capability. Moreover, time and hardware constraints may seriously reduce the degrees of freedom in the search for a single optimal network. A very promising way to partially overcome these drawbacks is the use of MLP ensembles: averaging and voting techniques are widely used in classical statistical pattern recognition and can be fruitfully applied to MLP classifiers. This work summarizes our experience in this field. A real-world OCR task is used as a test case to compare different models.
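The averaging and voting combination schemes mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual method: it assumes each ensemble member outputs per-class probabilities (e.g. softmax scores), averages them for the averaging rule, and counts argmax votes for the voting rule. All function and variable names here are hypothetical.

```python
import numpy as np

def average_combine(prob_sets):
    """Average the class-probability outputs of the ensemble members
    and predict the class with the highest mean probability.
    prob_sets has shape (n_members, n_samples, n_classes)."""
    mean_probs = np.mean(prob_sets, axis=0)     # (n_samples, n_classes)
    return np.argmax(mean_probs, axis=1)

def majority_vote(prob_sets):
    """Each member casts one vote (its own argmax); the class with the
    most votes wins. Ties resolve to the lowest class index."""
    votes = np.argmax(prob_sets, axis=2)        # (n_members, n_samples)
    n_classes = prob_sets.shape[2]
    # Count votes per class for every sample.
    counts = np.apply_along_axis(
        lambda v: np.bincount(v, minlength=n_classes), 0, votes)
    return np.argmax(counts, axis=0)            # (n_samples,)

# Hypothetical outputs of three MLPs on two samples, three classes.
probs = np.array([
    [[0.6, 0.3, 0.1], [0.2, 0.5, 0.3]],  # member 1
    [[0.4, 0.4, 0.2], [0.1, 0.7, 0.2]],  # member 2
    [[0.3, 0.5, 0.2], [0.3, 0.4, 0.3]],  # member 3
])
avg_pred = average_combine(probs)   # mean probabilities, then argmax
vote_pred = majority_vote(probs)    # one argmax vote per member
```

Here both rules agree on the predictions, but they need not in general: averaging weighs each member's confidence, while voting discards it and counts only the winning class per member.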
This publication has 7 references indexed in Scilit:
- Self-organization of architecture by simulated hierarchical adaptive random partitioning. Published by Institute of Electrical and Electronics Engineers (IEEE), 2003
- Stacked generalization. Neural Networks, 1992
- Methods of combining multiple classifiers and their applications to handwriting recognition. IEEE Transactions on Systems, Man, and Cybernetics, 1992
- Adaptive Mixtures of Local Experts. Neural Computation, 1991
- An analysis on the performance of silicon implementations of backpropagation algorithms for artificial neural networks. IEEE Transactions on Computers, 1991
- Neural network classification: a Bayesian interpretation. IEEE Transactions on Neural Networks, 1990
- Neural network ensembles. Published by Institute of Electrical and Electronics Engineers (IEEE), 1990