Strong universal consistency of neural network classifiers
- 1 July 1993
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 39 (4), 1146-1151
- https://doi.org/10.1109/18.243433
Abstract
In statistical pattern recognition, a classifier is called universally consistent if its error probability converges to the Bayes risk as the size of the training data grows, for all possible distributions of the random variable pair of the observation vector and its class. It is proven that if a one-layered neural network with a properly chosen number of nodes is trained to minimize the empirical risk on the training data, then a universally consistent classifier results. It is shown that the exponent in the rate of convergence does not depend on the dimension if certain smoothness conditions on the distribution are satisfied. That is, this class of universally consistent classifiers does not suffer from the curse of dimensionality. A training algorithm is presented that finds the optimal set of parameters in polynomial time if the number of nodes and the space dimension are fixed and the amount of training data grows.
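For orientation, the following is a minimal sketch of the standard definitions the abstract refers to; the notation ((X, Y), D_n, g_n, k_n) is assumed here for illustration and is not taken verbatim from the paper.

```latex
% Sketch of the definitions behind the abstract (notation assumed, not the paper's own).
% Data: (X, Y) in R^d x {0,1}; training sequence D_n = ((X_1,Y_1), ..., (X_n,Y_n)).

% Bayes risk: the best achievable error probability over all decision rules g.
\[
  L^* = \inf_{g:\,\mathbb{R}^d \to \{0,1\}} \Pr\{\, g(X) \neq Y \,\}.
\]

% A sequence of classifiers g_n built from D_n is strongly universally consistent
% if its conditional error probability converges to L^* almost surely,
% for every distribution of (X, Y):
\[
  L(g_n) = \Pr\{\, g_n(X) \neq Y \mid D_n \,\} \;\longrightarrow\; L^*
  \quad \text{a.s., as } n \to \infty .
\]

% Empirical risk minimization over one-hidden-layer networks with k_n nodes:
% choose the network parameters that minimize the empirical (training) error
\[
  \widehat{L}_n(g) = \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}\{\, g(X_i) \neq Y_i \,\},
\]
% and use the minimizer as the classifier. The abstract's result is that this
% rule is universally consistent when k_n grows with n but slowly enough;
% the exact growth condition is the one given in the paper.
```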