On the capacity of Markov sources over noisy channels
- 13 November 2002
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- Vol. 5, 2997-3001
- https://doi.org/10.1109/glocom.2001.965977
Abstract
We present an expectation-maximization method for optimizing Markov process transition probabilities to increase the mutual information rate achievable when the Markov process is transmitted over a noisy finite-state machine channel. The method provides a tight lower bound on the achievable information rate of a Markov process over a noisy channel, and it is conjectured that it actually maximizes this information rate. The latter statement is supported by empirical evidence (not shown in this paper) obtained through brute-force optimization methods on low-order Markov processes. The proposed expectation-maximization procedure can be used to find tight lower bounds on the capacities of finite-state machine channels (say, partial response channels) or the noisy capacities of constrained (say, run-length limited) sequences, with the bounds becoming arbitrarily tight as the memory length of the input Markov process approaches infinity. The method links the Arimoto-Blahut algorithm to Shannon's noise-free entropy maximization by introducing the noisy adjacency matrix.
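The noise-free side of this link is Shannon's classical result: the capacity of a constrained system equals the base-2 logarithm of the spectral radius of the adjacency matrix of its finite-state presentation. As a minimal sketch (an illustrative example, not code from the paper), the snippet below computes the noiseless capacity of the (1, ∞) run-length-limited constraint, i.e. binary sequences with no two consecutive 1s; the paper's noisy adjacency matrix generalizes this quantity to noisy channels and is not reproduced here.

```python
import numpy as np

# Finite-state presentation of the (1, inf) RLL constraint
# (no two consecutive 1s):
#   state 0: last symbol was 0, may emit 0 (stay) or 1 (go to state 1)
#   state 1: last symbol was 1, may emit only 0 (go to state 0)
# A[i, j] counts the edges from state i to state j.
A = np.array([[1.0, 1.0],
              [1.0, 0.0]])

# Shannon: noiseless capacity = log2(spectral radius of A).
# For this matrix the spectral radius is the golden ratio.
rho = max(abs(np.linalg.eigvals(A)))
capacity = np.log2(rho)

print(f"spectral radius    = {rho:.6f}")       # ~1.618034
print(f"noiseless capacity = {capacity:.6f}")  # ~0.694242 bits/symbol
```

Maximizing the entropy of a Markov process on this graph attains the 0.6942 bits/symbol figure; the paper's expectation-maximization procedure plays the analogous role when the output of the graph is observed through a noisy channel.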
This publication has 11 references indexed in Scilit:
- On the achievable information rates of finite state ISI channels. Published by Institute of Electrical and Electronics Engineers (IEEE), 2002
- On the information rate of binary-input channels with memory. Published by Institute of Electrical and Electronics Engineers (IEEE), 2002
- Codes for digital recorders. IEEE Transactions on Information Theory, 1998
- The intersymbol interference channel: lower bounds on capacity and channel precoding loss. IEEE Transactions on Information Theory, 1996
- Coding for channels with cost constraints. IEEE Transactions on Information Theory, 1996
- Information rates for a discrete-time Gaussian channel with intersymbol interference and stationary inputs. IEEE Transactions on Information Theory, 1991
- On runlength codes. IEEE Transactions on Information Theory, 1988
- Optimal decoding of linear codes for minimizing symbol error rate (Corresp.). IEEE Transactions on Information Theory, 1974
- Computation of channel capacity and rate-distortion functions. IEEE Transactions on Information Theory, 1972
- A Mathematical Theory of Communication. Bell System Technical Journal, 1948