Relations between entropy and error probability
- 1 January 1994
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 40 (1), 259-266
- https://doi.org/10.1109/18.272494
Abstract
The relation between the entropy of a discrete random variable and the minimum attainable probability of error made in guessing its value is examined. While Fano's inequality provides a tight lower bound on the error probability in terms of the entropy, the authors derive a converse result: a tight upper bound on the minimal error probability in terms of the entropy. Both bounds are sharp, and they also yield a relation between the error probability of the maximum a posteriori (MAP) rule and the conditional entropy (equivocation), which is a useful uncertainty measure in several applications. Combining this relation with the classical channel coding theorem, the authors present a channel coding theorem for the equivocation which, unlike the channel coding theorem for error probability, is meaningful at all rates. This theorem is proved directly for discrete memoryless channels (DMCs), and from this proof it is further concluded that for R ≥ C the equivocation achieves its minimal value of R - C at a rate of n^{-1/2}, where n is the block length.
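To make the quantities in the abstract concrete, the sketch below (not from the paper; a minimal numerical illustration assuming entropies in bits, guessing with no side information, and an arbitrarily chosen 4-ary distribution, with hypothetical helper names map_error, entropy_bits, and fano_rhs) computes the MAP error probability and the entropy of a small source and checks Fano's inequality, H(X) ≤ h(Pe) + Pe·log2(M-1), which underlies the lower bound on the error probability discussed above.

```python
import numpy as np

def map_error(p):
    """Minimum error probability when guessing X with no observation:
    always guess the most likely value."""
    return 1.0 - np.max(p)

def entropy_bits(p):
    """Shannon entropy H(X) in bits."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def fano_rhs(pe, M):
    """Right-hand side of Fano's inequality: h(pe) + pe*log2(M - 1),
    an upper bound on H(X) given the guessing error probability pe."""
    h = 0.0 if pe in (0.0, 1.0) else -pe * np.log2(pe) - (1 - pe) * np.log2(1 - pe)
    return h + (pe * np.log2(M - 1) if M > 1 else 0.0)

# Illustrative 4-ary source (probabilities chosen only for the example)
p = np.array([0.5, 0.25, 0.15, 0.10])
pe = map_error(p)
H = entropy_bits(p)
print(f"MAP error probability: {pe:.3f}")
print(f"Entropy H(X):          {H:.3f} bits")
print(f"Fano bound h(pe) + pe*log2(M-1) = {fano_rhs(pe, len(p)):.3f} bits  (>= H)")
```

Numerically inverting Fano's inequality in pe recovers the lower bound on the error probability in terms of the entropy mentioned above; the paper's converse result supplies the matching upper bound.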