Concentration inequalities using the entropy method
Open Access
- 1 July 2003
- Journal article
- Published by the Institute of Mathematical Statistics in The Annals of Probability
- Vol. 31 (3) , 1583-1614
- https://doi.org/10.1214/aop/1055425791
Abstract
We investigate a new methodology, worked out by Ledoux and Massart, to prove concentration-of-measure inequalities. The method is based on certain modified logarithmic Sobolev inequalities. We provide some very simple and general ready-to-use inequalities. One of these inequalities may be considered as an exponential version of the Efron--Stein inequality. The main purpose of this paper is to point out the simplicity and the generality of the approach. We show how the new method can recover many of Talagrand's revolutionary inequalities and provide new applications in a variety of problems including Rademacher averages, Rademacher chaos, the number of certain small subgraphs in a random graph, and the minimum of the empirical risk in some statistical estimation problems.
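For context, the Efron--Stein inequality referred to in the abstract is, in one standard formulation (stated here for orientation, not quoted from the paper):

```latex
% Efron--Stein inequality (standard form).
% Let Z = f(X_1,\dots,X_n) with X_1,\dots,X_n independent, and let
% Z_i' = f(X_1,\dots,X_{i-1},X_i',X_{i+1},\dots,X_n), where X_i' is an
% independent copy of X_i. Then
\operatorname{Var}(Z) \;\le\; \frac{1}{2}\, \mathbb{E}\sum_{i=1}^{n} \bigl(Z - Z_i'\bigr)^{2}.
```

The "exponential version" mentioned in the abstract replaces this variance bound by a bound on the logarithmic moment-generating function of Z - E[Z] in terms of a quantity of the same type, which yields exponential tail bounds rather than only a second-moment bound.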
This publication has 33 indexed references; a selection:
- The infamous upper tail, Random Structures & Algorithms, 2002
- Empirical Margin Distributions and Bounding the Generalization Error of Combined Classifiers, The Annals of Statistics, 2002
- Some applications of concentration inequalities to statistics, Annales de la Faculté des sciences de Toulouse : Mathématiques, 2000
- Majorizing measures: the generic chaining, The Annals of Probability, 1996
- Isoperimetry and Gaussian analysis, Lecture Notes in Mathematics, 1996
- Poisson approximation for large deviations, Random Structures & Algorithms, 1990
- An Efron--Stein Inequality for Nonsymmetric Statistics, The Annals of Statistics, 1986
- The Jackknife Estimate of Variance, The Annals of Statistics, 1981
- Correction to bounds on conditional probabilities with applications, Probability Theory and Related Fields, 1977
- On general minimax theorems, Pacific Journal of Mathematics, 1958