Another look at the coding theorem of information theory—A tutorial
- 1 June 1970
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in Proceedings of the IEEE
- Vol. 58 (6), 894-913
- https://doi.org/10.1109/proc.1970.7797
Abstract
In this tutorial paper we consider the problem of the transmission of data from a general class of information sources over a general class of communication channels. The problem is the determination of the maximum attainable value of a (suitably defined) "reliability." In general the channel imposes limits on the attainable reliability in three ways: 1) by introducing "noise" into the system, 2) because of a "mismatch" between source and channel (for example, an analog data source and a digital communication channel), and 3) because of "costs" associated with various channel inputs (for example, signal "power"). We assume that the system designer is allowed to interpose data processors between the source and the channel input, and between the channel output and the user (called the "encoder" and "decoder," respectively) to combat these limitations. Shannon's coding theorem, which is the subject of this paper, gives an answer to this question of maximum reliability in the special case where no limit is imposed on the complexity of these processors. Since this is a tutorial paper, we emphasize motivating material and discussion at the expense of mathematical details and proofs.
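The abstract's notion of maximum attainable reliability is usually made concrete through channel capacity: Shannon's coding theorem says that reliable transmission is possible if and only if the source rate is below the channel's capacity. As a minimal sketch, using the standard binary symmetric channel model from information theory (this example is textbook material and is not drawn from the paper itself):

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H2(p) in bits; H2(0) = H2(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity in bits per channel use of a binary symmetric channel
    with crossover probability p: C = 1 - H2(p)."""
    return 1.0 - binary_entropy(p)

# A noiseless channel carries 1 bit per use; a channel that flips each
# bit with probability 1/2 carries no information at all.
print(bsc_capacity(0.0))   # 1.0
print(bsc_capacity(0.5))   # 0.0
print(round(bsc_capacity(0.11), 3))
```

The "noise" limitation of the abstract appears here as the entropy term H2(p) subtracted from the ideal one bit per use; the "mismatch" and "cost" limitations require richer channel models than this sketch covers.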
References
- C. E. Shannon, "A Mathematical Theory of Communication," Bell System Technical Journal, 1948