The rates of convergence of kernel regression estimates and classification rules
- 1 September 1986
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 32 (5) , 668-679
- https://doi.org/10.1109/tit.1986.1057226
Abstract
Both nonrecursive and recursive nonparametric regression estimates are studied. The rates of weak and strong convergence of kernel estimates, as well as the corresponding multiple classification errors, are derived without assuming the existence of a density of the measurements. An application of the obtained results to nonparametric Bayes prediction is presented.
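The kernel regression estimates the abstract refers to are of the Nadaraya-Watson form, m_n(x) = Σ_i Y_i K((x − X_i)/h_n) / Σ_i K((x − X_i)/h_n). A minimal sketch of this estimator (Gaussian kernel, fixed bandwidth `h`; the function name and test data are illustrative, not from the paper):

```python
import numpy as np

def kernel_regression(x_train, y_train, x_query, h):
    """Nadaraya-Watson kernel regression estimate with a Gaussian kernel.

    m_n(x) = sum_i y_i K((x - x_i)/h) / sum_i K((x - x_i)/h),
    where h is the bandwidth; its decay rate as n grows governs the
    convergence rates studied in the paper.
    """
    # Pairwise scaled differences between query and training points
    diffs = (x_query[:, None] - x_train[None, :]) / h
    # Gaussian kernel weights (unnormalized; the normalizer cancels)
    weights = np.exp(-0.5 * diffs**2)
    # Locally weighted average of the responses
    return (weights * y_train).sum(axis=1) / weights.sum(axis=1)

# Noisy samples from a smooth regression function
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 200)
y = np.sin(x) + 0.1 * rng.normal(size=200)
xq = np.linspace(-2, 2, 5)
estimate = kernel_regression(x, y, xq, h=0.3)
```

A recursive variant would instead update the numerator and denominator sums observation by observation, with a bandwidth h_n shrinking in n.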
This publication has 16 references indexed in Scilit:
- Distribution-Free Pointwise Consistency of Kernel Regression Estimate. The Annals of Statistics, 1984
- Distribution-free consistency of a nonparametric kernel regression estimate and classification. IEEE Transactions on Information Theory, 1984
- Almost everywhere convergence of a recursive regression function estimate and classification (Corresp.). IEEE Transactions on Information Theory, 1984
- On the Almost Everywhere Convergence of Nonparametric Regression Function Estimates. The Annals of Statistics, 1981
- The rate of convergence of k_n-NN regression estimates and classification rules (Corresp.). IEEE Transactions on Information Theory, 1981
- Asymptotic mean integrated squared errors of some nonparametric density estimators (Corresp.). IEEE Transactions on Information Theory, 1981
- Optimal Rates of Convergence for Nonparametric Estimators. The Annals of Statistics, 1980
- A unified theory of regularly varying sequences. Mathematische Zeitschrift, 1973
- Asymptotically optimal discriminant functions for pattern classification. IEEE Transactions on Information Theory, 1969
- Probability Inequalities for Sums of Bounded Random Variables. Journal of the American Statistical Association, 1963