Rényi's entropy and the probability of error
- 1 May 1978
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 24 (3) , 324-331
- https://doi.org/10.1109/tit.1978.1055890
Abstract
The basic properties of Rényi's entropy are reviewed, and its concavity properties are characterized. New bounds (referred to as I_α bounds) on the probability of error are derived from Rényi's entropy and are compared with known bounds. It is proved that for the two-class case, the I_2 bound is sharper than many of the previously known bounds. The difference between the I_2 bound and the true probability of error is at most 0.09.
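As an illustrative sketch of the quantities the abstract discusses (not the paper's I_2 bound itself, whose exact form is given in the article), the following computes the Rényi entropy of order α for a discrete posterior distribution and the Bayes probability of error for the two-class case; the example posterior values are assumptions chosen for illustration.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1), in nats:
    H_alpha(p) = (1 / (1 - alpha)) * log(sum_i p_i^alpha)."""
    if alpha <= 0 or alpha == 1:
        raise ValueError("alpha must be positive and different from 1")
    return math.log(sum(q ** alpha for q in p)) / (1.0 - alpha)

def bayes_error(p):
    """Probability of error of the Bayes decision rule, given the
    vector of posterior class probabilities p: P_e = 1 - max_i p_i."""
    return 1.0 - max(p)

# Hypothetical two-class posterior used only for illustration.
posterior = [0.7, 0.3]
h2 = renyi_entropy(posterior, 2)   # quadratic (order-2) Rényi entropy
pe = bayes_error(posterior)        # Bayes probability of error
```

For a uniform two-class posterior [0.5, 0.5], the order-2 entropy equals log 2 and the Bayes error equals 0.5, the worst case; as the posterior becomes more peaked, both quantities decrease together, which is the qualitative relationship the I_α bounds make precise.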
This publication has 13 references indexed in Scilit:
- Error estimation in pattern recognition via L_α-distance between posterior density functions. IEEE Transactions on Information Theory, 1976
- Patterns in pattern recognition: 1968-1974. IEEE Transactions on Information Theory, 1974
- On a New Class of Bounds on Bayes Risk in Multihypothesis Pattern Recognition. IEEE Transactions on Computers, 1974
- Probability of error, equivocation, and the Chernoff bound. IEEE Transactions on Information Theory, 1970
- A class of upper bounds on probability of error for multihypotheses pattern recognition (Corresp.). IEEE Transactions on Information Theory, 1969
- Generalization of Hölder's and Minkowski's inequalities. Mathematical Proceedings of the Cambridge Philosophical Society, 1968
- Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 1967
- Pseudo-Convex Functions. Journal of the Society for Industrial and Applied Mathematics Series A Control, 1965
- A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations. The Annals of Mathematical Statistics, 1952
- A Mathematical Theory of Communication. Bell System Technical Journal, 1948