Square root penalty: Adaptation to the margin in classification and in edge estimation
Open Access
- 1 June 2005
- journal article
- Published by Institute of Mathematical Statistics in The Annals of Statistics
- Vol. 33 (3), 1203-1224
- https://doi.org/10.1214/009053604000001066
Abstract
We consider the problem of adaptation to the margin in binary classification. We suggest a penalized empirical risk minimization classifier that adaptively attains, up to a logarithmic factor, fast optimal rates of convergence for the excess risk, that is, rates that can be faster than n^{-1/2}, where n is the sample size. We show that our method also gives adaptive estimators for the problem of edge estimation.
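For context, here is a hedged sketch of the setup the abstract refers to; the notation is generic and not taken from the paper. A penalized empirical risk minimization classifier is chosen by minimizing the empirical misclassification rate plus a penalty term, and its performance is measured by the excess risk relative to the Bayes classifier.

```latex
% Generic penalized empirical risk minimization (assumed notation, not the paper's).
% Data: (X_1, Y_1), \dots, (X_n, Y_n), labels Y_i \in \{0, 1\}, candidate class \mathcal{F}.
\[
  \hat{f} \in \arg\min_{f \in \mathcal{F}}
    \Big\{ \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}\{ f(X_i) \neq Y_i \} + \mathrm{pen}(f) \Big\}
\]
% Excess risk of a classifier f, where R(f) = P(f(X) \neq Y) and f^* is the Bayes classifier:
\[
  \mathcal{E}(f) = R(f) - R(f^*)
\]
% Under a margin (low-noise) assumption, the excess risk of \hat{f} can converge
% faster than the standard n^{-1/2} rate, up to logarithmic factors.
```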