Coding for computing
- 1 March 2001
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 47 (3), 903-917
- https://doi.org/10.1109/18.915643
Abstract
A sender communicates with a receiver who wishes to reliably evaluate a function of their combined data. We show that if only the sender can transmit, the number of bits required is a conditional entropy of a naturally defined graph. We also determine the number of bits needed when the communicators exchange two messages.
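For reference, a minimal sketch of the one-message characterization as it is commonly stated, assuming the standard definitions of the characteristic graph and conditional graph entropy (the notation below follows the usual formulation of this result and is not quoted from this page):

```latex
% Hedged sketch, assuming the standard statement of the one-message result.
% Characteristic graph G of X (with respect to f and Y): the vertex set is the
% support of X, and two symbols x, x' are adjacent iff there exists y with
% p(x,y) p(x',y) > 0 and f(x,y) \neq f(x',y).
\[
  R_{\min}
  \;=\; H_G(X \mid Y)
  \;=\; \min_{\substack{W - X - Y \\ X \in W \in \Gamma(G)}} I(W; X \mid Y),
\]
% where \Gamma(G) denotes the independent sets of G, and the minimum is taken
% over auxiliary random variables W ranging over independent sets of G that
% contain X and form a Markov chain W - X - Y.
```

Informally, the receiver only needs to distinguish source symbols that could lead to different function values given its side information, which is what the independent-set structure of $G$ captures.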