The capacity of the arbitrarily varying channel revisited: positivity, constraints
- 1 March 1988
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 34 (2), 181-193
- https://doi.org/10.1109/18.2627
Abstract
A well-known result of R. Ahlswede (1970) asserts that the deterministic code capacity of an arbitrarily varying channel (AVC), under the average-error-probability criterion, either equals its random code capacity or else is zero. A necessary and sufficient condition for deciding between these alternatives is identified: the capacity is zero if and only if the AVC is symmetrizable. The capacity of the AVC is also determined under constraints on the transmitted codewords as well as on the channel state sequences, and it is shown that it may be positive yet less than the corresponding random code capacity. A special case of the results resolves a weakened version of a fundamental problem of coding theory.
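For context, the symmetrizability condition cited in the abstract can be sketched in its standard finite-alphabet form (the notation $W(y \mid x, s)$ for the channel law and $U(s \mid x)$ for an auxiliary channel is assumed here, not taken from the text above): the AVC is symmetrizable if there exists a stochastic matrix $U(s \mid x)$ such that

$$\sum_{s} W(y \mid x, s)\, U(s \mid x') \;=\; \sum_{s} W(y \mid x', s)\, U(s \mid x) \qquad \text{for all } x,\ x',\ y .$$

Informally, such a $U$ lets the state sequence imitate any legitimate codeword, so a decoder using deterministic codes cannot distinguish the true codeword from the imitation under the average-error criterion.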
References
- Arbitrarily varying channels with constrained inputs and states, IEEE Transactions on Information Theory, 1988
- Exponential error bounds for random codes in the arbitrarily varying channel, IEEE Transactions on Information Theory, 1985
- On the capacity of the arbitrarily varying channel for maximum probability of error, Probability Theory and Related Fields, 1981
- Elimination of correlation in random codes for arbitrarily varying channels, Probability Theory and Related Fields, 1978
- Coding Theorems of Information Theory, published by Springer Nature, 1978
- A Note on the Existence of the Weak Capacity for Channels with Arbitrarily Varying Channel Probability Functions and Its Relation to Shannon's Zero Error Capacity, The Annals of Mathematical Statistics, 1970
- The Capacities of Certain Channel Classes Under Random Coding, The Annals of Mathematical Statistics, 1960
- The zero error capacity of a noisy channel, IEEE Transactions on Information Theory, 1956