The random coding bound is tight for the average code (Corresp.)
- 1 March 1973
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 19 (2), 244-246
- https://doi.org/10.1109/tit.1973.1054971
Abstract
The random coding bound of information theory provides a well-known upper bound to the probability of decoding error for the best code of a given rate and block length. The bound is constructed by upper-bounding the average error probability over an ensemble of codes. The bound is known to give the correct exponential dependence of error probability on block length for transmission rates above the critical rate, but it gives an incorrect exponential dependence at rates below a second, lower critical rate. Here we derive an asymptotic expression for the average error probability over the ensemble of codes used in the random coding bound. The result shows that the weakness of the random coding bound at rates below the second critical rate is due not to upper-bounding the ensemble average, but rather to the fact that the best codes are much better than the average at low rates.
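For context, a minimal sketch of the random coding bound in its standard (Gallager) form, stated here as background rather than reproduced from the correspondence itself; the symbols $N$ (block length), $R$ (rate), $Q$ (input distribution), and $P(y\mid x)$ (channel law) are the usual textbook notation, assumed for illustration:

```latex
% Standard random coding bound for a discrete memoryless channel:
% the ensemble-average block error probability at rate R and block length N
% is exponentially bounded by the random coding exponent E_r(R).
\[
  \overline{P}_e \;\le\; e^{-N E_r(R)},
  \qquad
  E_r(R) \;=\; \max_{0 \le \rho \le 1} \; \max_{Q} \;
    \bigl[\, E_0(\rho, Q) - \rho R \,\bigr],
\]
\[
  E_0(\rho, Q) \;=\; -\ln \sum_{y}
    \Bigl[\sum_{x} Q(x)\, P(y \mid x)^{1/(1+\rho)} \Bigr]^{1+\rho}.
\]
% E_r(R) gives the true exponential decay for rates above the critical rate
% (where the maximizing rho is below 1); at sufficiently low rates the
% exponent is loose, which is the regime the correspondence examines.
```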