Interactive Data Compression
- 24 August 2005
- Conference paper
- Published by the Institute of Electrical and Electronics Engineers (IEEE)
- No. 02725428, pp. 100-108
- https://doi.org/10.1109/sfcs.1984.715906
Abstract
Let X and Y be two random variables with joint probability distribution p(x,y), joint entropy H(X,Y), and conditional entropies H(X|Y) and H(Y|X). Person P_X knows X and person P_Y knows Y. They communicate over a noiseless two-way channel so that both know X and Y. It is proved that, on the average, at least H(X|Y) + H(Y|X) bits must be exchanged and that H(X,Y) + 2 bits are sufficient. If p(x,y) > 0 for all (x,y), then at least H(X,Y) bits must be communicated on the average. However, if p(x,y) is uniform over its support set, the average number of bits needed is close to H(X|Y) + H(Y|X). Randomized protocols can reduce the amount of communication considerably, but only when some probability of error is acceptable.
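To make the stated bounds concrete, here is a minimal Python sketch (not part of the paper) that computes H(X,Y), H(X|Y), and H(Y|X) for a small hypothetical joint distribution and prints the resulting lower and upper bounds on the expected communication.

```python
import math

def entropies(p):
    """Given p as a dict {(x, y): prob}, return H(X,Y), H(X|Y), H(Y|X)."""
    px, py = {}, {}
    # Accumulate the marginal distributions of X and Y.
    for (x, y), q in p.items():
        px[x] = px.get(x, 0.0) + q
        py[y] = py.get(y, 0.0) + q
    h_xy = -sum(q * math.log2(q) for q in p.values() if q > 0)
    h_x = -sum(q * math.log2(q) for q in px.values() if q > 0)
    h_y = -sum(q * math.log2(q) for q in py.values() if q > 0)
    # Chain rule: H(X|Y) = H(X,Y) - H(Y) and H(Y|X) = H(X,Y) - H(X).
    return h_xy, h_xy - h_y, h_xy - h_x

# Hypothetical example: X uniform on {0,1,2,3}; Y equals X with
# probability 1/2, otherwise X+1 mod 4. Uniform over 8 pairs.
p = {}
for x in range(4):
    p[(x, x)] = 1 / 8
    p[(x, (x + 1) % 4)] = 1 / 8

h_xy, h_x_given_y, h_y_given_x = entropies(p)
print(f"lower bound  H(X|Y)+H(Y|X) = {h_x_given_y + h_y_given_x:.3f} bits")
print(f"upper bound  H(X,Y)+2      = {h_xy + 2:.3f} bits")
```

For this distribution the sketch prints a lower bound of 2 bits and an upper bound of 5 bits; since the distribution is uniform over its support, the abstract's result implies the optimal average communication is close to the lower bound H(X|Y) + H(Y|X).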