Generalizing the Fano inequality
- 1 July 1994
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 40 (4) , 1247-1251
- https://doi.org/10.1109/18.335943
Abstract
The Fano inequality gives a lower bound on the mutual information between two random variables that take values on an M-element set, provided at least one of the random variables is equiprobable. The authors show several simple lower bounds on mutual information which do not assume such a restriction. In particular, this can be accomplished by replacing log M with the infinite-order Rényi entropy in the Fano inequality. Applications to hypothesis testing are exhibited, along with bounds on mutual information in terms of the a priori and a posteriori error probabilities.
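As a hedged sketch of the result described above (the notation here is assumed, not taken verbatim from the paper): writing P_e for the error probability of estimating X from Y, and h for the binary entropy function, the classical Fano inequality and the equiprobable-case mutual-information bound read:

```latex
% Classical Fano inequality:
H(X \mid Y) \;\le\; h(P_e) + P_e \log(M-1)

% Hence, since I(X;Y) = H(X) - H(X \mid Y):
I(X;Y) \;\ge\; H(X) - h(P_e) - P_e \log(M-1)

% When X is equiprobable, H(X) = \log M, giving the usual form
I(X;Y) \;\ge\; \log M - h(P_e) - P_e \log(M-1)
```

Per the abstract, the generalization replaces log M with the infinite-order Rényi entropy (min-entropy) H_\infty(X) = -\log \max_x \Pr[X = x], which removes the equiprobability assumption; the exact constants and conditions of the resulting bounds are as given in the paper.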