A general formula for channel capacity
- 1 July 1994
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 40 (4), 1147-1157
- https://doi.org/10.1109/18.335960
Abstract
A formula for the capacity of arbitrary single-user channels without feedback (not necessarily information stable, stationary, etc.) is proved. Capacity is shown to equal the supremum, over all input processes, of the input-output inf-information rate, defined as the liminf in probability of the normalized information density. The key to this result is a new converse approach based on a simple new lower bound on the error probability of m-ary hypothesis tests among equiprobable hypotheses. A necessary and sufficient condition for the validity of the strong converse is given, as well as general expressions for ε-capacity.
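The abstract states the result only in words. As a hedged sketch, the standard statement of these quantities in the information-theory literature reads as below; the symbols (X for the input process, W^n for the channel, M for the number of messages, γ and ε) are notational assumptions of this sketch, not taken from the page itself.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% General capacity formula (sketch): capacity is the supremum, over all
% input processes X, of the input-output inf-information rate.
\[
  C \;=\; \sup_{\mathbf{X}} \; \underline{I}(\mathbf{X};\mathbf{Y}),
\]
% where the inf-information rate \underline{I}(X;Y) is the liminf in
% probability of the normalized information density between the channel
% input X^n and output Y^n:
\[
  \frac{1}{n}\, i_{X^n W^n}(X^n;Y^n)
  \;=\; \frac{1}{n}\, \log
  \frac{P_{Y^n \mid X^n}(Y^n \mid X^n)}{P_{Y^n}(Y^n)} .
\]
% The converse rests on a lower bound of the following shape: any code
% with M equiprobable messages and block length n has error probability
\[
  \varepsilon \;\ge\;
  \Pr\!\left[ \frac{1}{n}\, i_{X^n W^n}(X^n;Y^n)
    \le \frac{1}{n}\log M - \gamma \right] - e^{-n\gamma}
  \qquad \text{for every } \gamma > 0 .
\]

\end{document}
```

Informally, the liminf in probability of a sequence of random variables is the largest constant that the sequence eventually exceeds with probability approaching one; this is what replaces the usual mutual-information rate when the channel is not information stable.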