A conditional entropy bound for a pair of discrete random variables
- 1 September 1975
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 21 (5), 493-501
- https://doi.org/10.1109/tit.1975.1055437
Abstract
Let $X, Y$ be a pair of discrete random variables with a given joint probability distribution. For $0 \leq x \leq H(X)$, the entropy of $X$, define the function $F(x)$ as the infimum of $H(Y \mid W)$, the conditional entropy of $Y$ given $W$, with respect to all discrete random variables $W$ such that (a) $H(X \mid W) = x$, and (b) $W$ and $Y$ are conditionally independent given $X$. This paper concerns the function $F$, its properties, its calculation, and its applications to several problems in information theory.
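The two constraints in the definition can be checked numerically for any candidate auxiliary variable $W$. The sketch below (an illustration, not part of the paper) computes $H(X \mid W)$ and $H(Y \mid W)$ from a joint distribution $p(x, y)$ and a channel $p(w \mid x)$, using the Markov condition $W$ -- $X$ -- $Y$ to build $p(w, y)$; the function and variable names are my own.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def cond_entropies(p_xy, p_w_given_x):
    """Given joint p(x, y) and a channel p(w | x), with W - X - Y a
    Markov chain, return the pair (H(X | W), H(Y | W))."""
    nx, ny = len(p_xy), len(p_xy[0])
    nw = len(p_w_given_x[0])
    p_x = [sum(row) for row in p_xy]
    # Joint p(w, x) = p(x) * p(w | x).
    p_wx = [[p_x[x] * p_w_given_x[x][w] for x in range(nx)]
            for w in range(nw)]
    # Joint p(w, y) = sum_x p(x, y) * p(w | x), valid because W and Y
    # are conditionally independent given X.
    p_wy = [[sum(p_xy[x][y] * p_w_given_x[x][w] for x in range(nx))
             for y in range(ny)] for w in range(nw)]
    p_w = [sum(p_wx[w]) for w in range(nw)]
    h_x_w = sum(p_w[w] * entropy([p_wx[w][x] / p_w[w] for x in range(nx)])
                for w in range(nw) if p_w[w] > 0)
    h_y_w = sum(p_w[w] * entropy([p_wy[w][y] / p_w[w] for y in range(ny)])
                for w in range(nw) if p_w[w] > 0)
    return h_x_w, h_y_w
```

The two extreme choices of $W$ recover the endpoints of $F$: taking $W = X$ gives $H(X \mid W) = 0$ and $H(Y \mid W) = H(Y \mid X)$, while a constant $W$ gives $H(X \mid W) = H(X)$ and $H(Y \mid W) = H(Y)$. Approximating the infimum for intermediate $x$ would require searching over all channels $p(w \mid x)$.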