Nonparametric estimation via empirical risk minimization
- 1 May 1995
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 41 (3), 677-687
- https://doi.org/10.1109/18.382014
Abstract
A general notion of universal consistency of nonparametric estimators is introduced that applies to regression estimation, conditional median estimation, curve fitting, pattern recognition, and learning concepts. General methods for proving consistency of estimators based on minimizing the empirical error are shown. In particular, distribution-free almost sure consistency of neural network estimates and generalized linear estimators is established.
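To make the abstract's phrase "minimizing the empirical error" concrete, the following is a minimal sketch of the estimate for the regression case under squared loss; the notation (the sieve of candidate classes F_n, the i.i.d. sample) is assumed here for illustration and is not quoted from the paper, whose framework also covers other losses, such as absolute error for conditional median estimation.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Sketch of an empirical-risk-minimization regression estimate (assumed notation):
% given an i.i.d. sample (X_1,Y_1),...,(X_n,Y_n) and a class F_n of candidate
% functions (e.g., neural networks of bounded size), pick the function that
% minimizes the average squared error on the sample.
\[
  \hat{f}_n \;=\; \operatorname*{arg\,min}_{f \in \mathcal{F}_n}
  \frac{1}{n} \sum_{i=1}^{n} \bigl( f(X_i) - Y_i \bigr)^2
\]
\end{document}
```

Distribution-free almost sure consistency, in this reading, means that the risk of such an estimate converges almost surely to the best achievable risk for every distribution of the data, provided the classes F_n grow suitably with n.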