Approximation theory of output statistics
- 1 May 1993
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 39 (3), 752-772
- https://doi.org/10.1109/18.256486
Abstract
Given a channel and an input process, the minimum randomness of those input processes whose output statistics approximate the original output statistics with arbitrary accuracy is studied. The notion of resolvability of a channel, defined as the number of random bits required per channel use in order to generate an input that achieves arbitrarily accurate approximation of the output statistics for any given input process, is introduced. A general formula for resolvability that holds regardless of the channel memory structure is obtained. It is shown that, for most channels, resolvability is equal to the Shannon capacity. By-products of the analysis are a general formula for the minimum achievable source coding rate of any finite-alphabet source and a strong converse of the identification coding theorem, which holds for any channel that satisfies the strong converse of the channel coding theorem.
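The "general formula for resolvability" and the source-coding by-product mentioned above are usually written in information-spectrum notation; the sketch below records that standard form for orientation. The notation (sup-information rate, sup-entropy rate, limsup in probability) is assumed from the standard presentation of this line of work rather than quoted from this page, so it should be read as a sketch, not as the paper's exact statement.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Sketch, not quoted from the paper: resolvability S in the standard
% information-spectrum form, where p-limsup denotes the limit superior
% in probability of a sequence of random variables.
\[
  S \;=\; \sup_{\mathbf{X}} \bar{I}(\mathbf{X};\mathbf{Y}),
  \qquad
  \bar{I}(\mathbf{X};\mathbf{Y})
  \;=\;
  \text{p-}\limsup_{n\to\infty}
  \frac{1}{n}\,
  \log\frac{P_{Y^n\mid X^n}(Y^n\mid X^n)}{P_{Y^n}(Y^n)}.
\]
% For channels satisfying the strong converse of the channel coding theorem,
% S coincides with the Shannon capacity. The source-coding by-product takes
% the analogous form: the minimum achievable fixed-length coding rate of a
% finite-alphabet source is its sup-entropy rate.
\[
  \bar{H}(\mathbf{X})
  \;=\;
  \text{p-}\limsup_{n\to\infty}
  \frac{1}{n}\,
  \log\frac{1}{P_{X^n}(X^n)}.
\]
\end{document}
```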