Extremes of Information Combining
- 4 April 2005
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 51 (4), 1313-1325
- https://doi.org/10.1109/tit.2005.844077
Abstract
Extreme densities for information combining are found for two important channel models: the binary-input symmetric parallel broadcast channel and the parity-constrained-input symmetric parallel channels. From these extremes, upper and lower mutual information thresholds are stated for per-bit maximum a posteriori probability (MAP) decoding and for low-density parity-check (LDPC) code ensembles.
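The flavor of these extremes can be illustrated by the well-known two-observation case for binary-input symmetric channels with component mutual informations $I_1, I_2$, binary entropy function $h$, and $h^{-1}$ its inverse on $[0, 1/2]$. This is a sketch in standard notation only; the paper's general results are stated in terms of extremal densities and need not take exactly this form.

```latex
% Parallel (broadcast) combining: both channels observe the same bit X.
% Among binary-input symmetric channels with fixed I_1, I_2, the combined
% mutual information is largest for BECs and smallest for BSCs:
\[
  I(X;\, Y_1, Y_2) \;\le\; 1 - (1 - I_1)(1 - I_2),
\]
% with equality when both channels are BECs; the minimum is attained when
% both are BSCs with crossover probabilities p_i = h^{-1}(1 - I_i).
%
% Parity-constrained combining: the bit of interest is the parity
% X_1 \oplus X_2, and the extremes reverse (BEC is now the worst case,
% BSC the best):
\[
  I_1 I_2 \;\le\; I(X_1 \oplus X_2;\, Y_1, Y_2) \;\le\; 1 - h(p_1 \star p_2),
  \qquad p_1 \star p_2 = p_1(1 - p_2) + (1 - p_1)\, p_2,
\]
% with the lower extreme attained by BECs and the upper by BSCs.
```

Roughly speaking, iterating such single-node extremes over a code's graph is what yields the upper and lower LDPC mutual information thresholds referred to in the abstract.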