On the achievable information rates of finite state ISI channels
- 13 November 2002
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- pp. 2992-2996
- https://doi.org/10.1109/glocom.2001.965976
Abstract
In this paper, we present two simple Monte Carlo methods for estimating the achievable information rates of general finite state channels. Both methods require only the ability to simulate the channel with an a posteriori probability (APP) detector matched to the channel. The first method estimates the mutual information rate between the input random process and the output random process, provided that both processes are stationary and ergodic. When the inputs are i.i.d. equiprobable, this rate is known as the Symmetric Information Rate (SIR). The second method estimates the achievable information rate of an explicit coding system which interleaves m independent codes onto the channel and employs multistage decoding. For practical values of m, numerical results show that this system nearly achieves the SIR. Both methods are applied to the class of partial response channels commonly used in magnetic recording.
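To make the first method concrete, the sketch below estimates the SIR of a unit-energy partial-response channel with AWGN: it simulates one long input/output realization, runs the forward recursion of an APP (BCJR-style) detector matched to the channel to obtain -(1/n) log2 p(y_1, ..., y_n) as an estimate of the output entropy rate h(Y), and returns the SIR as h(Y) - h(Y|X). This is only a minimal illustration of the general recipe described in the abstract, not the authors' code; the function name `estimate_sir`, the dicode example, the block length, and the SNR convention (SNR = 1/sigma^2 for a unit-energy channel) are assumptions made for the example.

```python
import numpy as np


def estimate_sir(h, sigma, n=50_000, seed=0):
    """Monte Carlo estimate of the symmetric information rate (SIR) of the
    ISI channel y_k = sum_j h[j] * x_{k-j} + w_k with i.i.d. equiprobable
    inputs x_k in {+1, -1} and AWGN w_k of standard deviation sigma.

    Recipe (sketch of the first method in the abstract):
      h(Y)  ~  -(1/n) * log2 p(y_1, ..., y_n)   via the forward APP recursion,
      SIR    =  h(Y) - h(Y|X),  with  h(Y|X) = 0.5 * log2(2*pi*e*sigma^2).
    """
    rng = np.random.default_rng(seed)
    h = np.asarray(h, dtype=float)
    m = len(h) - 1                       # channel memory
    n_states = 2 ** m                    # trellis state = last m inputs

    # Simulate i.i.d. +-1 inputs and the noisy channel output (steady state only).
    x = rng.choice([+1.0, -1.0], size=n + m)
    y = np.convolve(x, h, mode="full")[m:n + m] + sigma * rng.standard_normal(n)

    # Precompute the noiseless output and next state for every (state, input) branch.
    # Bit j of a state holds the input at delay j + 1 (0 -> -1, 1 -> +1).
    means = np.empty((n_states, 2))
    next_state = np.empty((n_states, 2), dtype=int)
    for s in range(n_states):
        past = np.array([1.0 if (s >> j) & 1 else -1.0 for j in range(m)])
        isi = float(np.dot(h[1:], past)) if m > 0 else 0.0
        for bit, xk in enumerate((-1.0, +1.0)):
            means[s, bit] = h[0] * xk + isi
            next_state[s, bit] = ((s << 1) | bit) & (n_states - 1)

    # Forward APP recursion; the per-symbol normalizers multiply to p(y_1..y_n).
    alpha = np.full(n_states, 1.0 / n_states)   # exact prior for i.i.d. inputs
    log2_py = 0.0
    norm = 1.0 / np.sqrt(2.0 * np.pi * sigma ** 2)
    for k in range(n):
        like = norm * np.exp(-(y[k] - means) ** 2 / (2.0 * sigma ** 2))
        new_alpha = np.zeros(n_states)
        for s in range(n_states):
            for bit in (0, 1):
                new_alpha[next_state[s, bit]] += 0.5 * alpha[s] * like[s, bit]
        z = new_alpha.sum()              # z_k = p(y_k | y_1, ..., y_{k-1})
        log2_py += np.log2(z)
        alpha = new_alpha / z

    h_y = -log2_py / n                                            # output entropy rate
    h_y_given_x = 0.5 * np.log2(2.0 * np.pi * np.e * sigma ** 2)  # Gaussian noise entropy
    return h_y - h_y_given_x


if __name__ == "__main__":
    # Dicode (1 - D) partial-response channel, normalized to unit energy.
    h_dicode = np.array([1.0, -1.0]) / np.sqrt(2.0)
    snr_db = 5.0
    sigma = np.sqrt(10.0 ** (-snr_db / 10.0))   # assumes SNR = 1 / sigma^2
    print(f"SIR estimate at {snr_db} dB: {estimate_sir(h_dicode, sigma):.3f} bit/channel use")
```

The same forward pass, run once with the true inputs and once marginalizing over them, is the essence of the method; longer block lengths n tighten the estimate by the ergodic theorems cited below.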
References
- On the information rate of binary-input channels with memory. Institute of Electrical and Electronics Engineers (IEEE), 2002.
- Entropy and channel capacity in the regenerative setup with applications to Markov channels. Institute of Electrical and Electronics Engineers (IEEE), 2002.
- Elements of Information Theory. Wiley, 2001.
- Design of capacity-approaching irregular low-density parity-check codes. IEEE Transactions on Information Theory, 2001.
- Limit theorems for functionals of ergodic Markov chains with general state space. Memoirs of the American Mathematical Society, 1999.
- Capacity, mutual information, and coding for finite-state Markov channels. IEEE Transactions on Information Theory, 1996.
- Information rates for a discrete-time Gaussian channel with intersymbol interference and stationary inputs. IEEE Transactions on Information Theory, 1991.
- The Strong Ergodic Theorem for Densities: Generalized Shannon-McMillan-Breiman Theorem. The Annals of Probability, 1985.