A conditional entropy bound for a pair of discrete random variables

Abstract
Let $X$, $Y$ be a pair of discrete random variables with a given joint probability distribution. For $0 \leq x \leq H(X)$, the entropy of $X$, define the function $F(x)$ as the infimum of $H(Y \mid W)$, the conditional entropy of $Y$ given $W$, with respect to all discrete random variables $W$ such that (a) $H(X \mid W) = x$, and (b) $W$ and $Y$ are conditionally independent given $X$. This paper concerns the function $F$, its properties, its calculation, and its applications to several problems in information theory.
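
For reference, the definition stated in the abstract can be written compactly as below. This is only a restatement; the Markov-chain shorthand $Y$ -- $X$ -- $W$ is not in the original text and is used here to denote condition (b), the conditional independence of $W$ and $Y$ given $X$.

% Compact restatement of the definition of F from the abstract.
% The Markov-chain shorthand Y -- X -- W is an added notation for
% condition (b): W and Y are conditionally independent given X.
\[
  F(x) \;=\; \inf_{W}\bigl\{\, H(Y \mid W) \;:\; H(X \mid W) = x,\;
      Y \text{ -- } X \text{ -- } W \text{ a Markov chain} \,\bigr\},
  \qquad 0 \le x \le H(X).
\]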
