On the information rate of binary-input channels with memory
- 13 November 2002
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- Vol. 9, 2692-2695
- https://doi.org/10.1109/icc.2001.936639
Abstract
The entropy rate of a finite-state hidden Markov model can be estimated by forward sum-product trellis processing (i.e., the forward recursion of the Baum-Welch/BCJR algorithm) of simulated model output data. This can be used to compute information rates of binary-input AWGN channels with memory.
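The method in the abstract can be sketched in a few lines: simulate a long output sequence from the hidden Markov model, then run the scaled forward (alpha) recursion of the BCJR algorithm on that sequence; the accumulated log of the scaling factors gives -(1/n) log p(y_1..y_n), which converges to the entropy rate. The following is a minimal sketch under assumed parameters (the two-state transition matrix `P` and emission probabilities `mu` are hypothetical, not from the paper):

```python
import math
import random

def simulate_hmm(P, mu, n, rng):
    """Simulate n binary outputs of a hidden Markov model.
    P[i][j]: state transition probabilities; mu[s]: prob of emitting 1 in state s.
    (Illustrative model, not the channel model used in the paper.)"""
    s = 0
    out = []
    for _ in range(n):
        s = 0 if rng.random() < P[s][0] else 1   # two-state chain
        out.append(1 if rng.random() < mu[s] else 0)
    return out

def entropy_rate_estimate(P, mu, y):
    """Estimate -(1/n) log2 p(y) via the scaled forward (BCJR alpha) recursion."""
    S = len(P)
    alpha = [1.0 / S] * S            # uniform initial state distribution
    log_p = 0.0
    for yk in y:
        new = [0.0] * S
        for j in range(S):
            e = mu[j] if yk == 1 else 1.0 - mu[j]   # emission probability in state j
            new[j] = e * sum(alpha[i] * P[i][j] for i in range(S))
        scale = sum(new)             # equals p(y_k | y_1, ..., y_{k-1})
        log_p += math.log2(scale)
        alpha = [a / scale for a in new]
    return -log_p / len(y)           # estimated entropy rate in bits per symbol

rng = random.Random(0)
P = [[0.9, 0.1], [0.2, 0.8]]         # hypothetical state transition matrix
mu = [0.05, 0.95]                    # hypothetical emission probabilities
y = simulate_hmm(P, mu, 200_000, rng)
print(entropy_rate_estimate(P, mu, y))
```

The per-step rescaling is what makes the recursion numerically stable over long sequences; without it, the unnormalized forward messages underflow, which is why the log-probability is accumulated from the scale factors rather than computed at the end.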
This publication has 6 references indexed in Scilit:
- Elements of Information Theory. Wiley, 2001.
- "The intersymbol interference channel: lower bounds on capacity and channel precoding loss." IEEE Transactions on Information Theory, 1996.
- "Maximum-likelihood estimation for hidden Markov models." Stochastic Processes and their Applications, 1992.
- "Information rates for a discrete-time Gaussian channel with intersymbol interference and stationary inputs." IEEE Transactions on Information Theory, 1991.
- "Optimal decoding of linear codes for minimizing symbol error rate (Corresp.)." IEEE Transactions on Information Theory, 1974.
- "Statistical Inference for Probabilistic Functions of Finite State Markov Chains." The Annals of Mathematical Statistics, 1966.