Better Digit Recognition with a Committee of Simple Neural Nets
- 1 September 2011
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- No. 15205363, pp. 1250-1254
- https://doi.org/10.1109/icdar.2011.252
Abstract
We present a new method to train the members of a committee of one-hidden-layer neural nets. Instead of training various nets on subsets of the training data, we preprocess the training data for each individual model such that the corresponding errors are decorrelated. On the MNIST digit recognition benchmark we obtain a recognition error rate of 0.39% using a committee of 25 one-hidden-layer neural nets, which is on par with state-of-the-art recognition rates of more complicated systems.
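The abstract does not include code; the sketch below only illustrates the final committee step, plain (unweighted) averaging of the members' class-probability outputs followed by an argmax, a standard committee rule. The function name, array shapes, and the toy data are our own assumptions, not from the paper.

```python
import numpy as np

def committee_predict(member_probs):
    """Combine committee members by averaging their class probabilities.

    member_probs: array of shape (n_members, n_samples, n_classes),
    e.g. the softmax outputs of 25 one-hidden-layer nets (hypothetical shape).
    Returns the predicted class index for each sample.
    """
    avg = np.mean(member_probs, axis=0)   # unweighted average over members
    return np.argmax(avg, axis=1)         # committee decision per sample

# Toy example: 3 members, 2 samples, 2 classes.
# Members disagree on sample 0; the average resolves the tie.
probs = np.array([
    [[0.6, 0.4], [0.2, 0.8]],
    [[0.4, 0.6], [0.3, 0.7]],
    [[0.9, 0.1], [0.1, 0.9]],
])
predictions = committee_predict(probs)
```

The point of the paper's preprocessing is that averaging only helps when the members' errors are decorrelated; with identically trained nets the averaged probabilities would simply reproduce each member's mistakes.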
This publication has 16 references indexed in Scilit:
- Combining Multiple Classifiers for Faster Optical Character Recognition. Springer Nature, 2006
- Object Recognition with Features Inspired by Visual Cortex. IEEE, 2005
- Best practices for convolutional neural networks applied to visual document analysis. IEEE, 2005
- Towards stochastic conjugate gradient methods. IEEE, 2004
- The combining classifier: to train or not to train? IEEE, 2003
- A theoretical study on six classifier fusion strategies. IEEE, 2002
- Optimal linear combination of neural networks for improving classification performance. IEEE, 2000
- On combining classifiers. IEEE, 1998
- Bagging predictors. Machine Learning, 1996
- Improving model accuracy using optimal linear combinations of trained neural networks. IEEE Transactions on Neural Networks, 1995