Good codes can be produced by a few permutations
- 1 May 1982
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 28 (3), 430-443
- https://doi.org/10.1109/tit.1982.1056514
Abstract
Our main result is that good codes, even those meeting the random coding bound, can be produced with relatively few (linear in the block length) permutations from a single codeword. This cutdown in complexity may be of practical importance. The motivation for looking at such codes came from Ahlswede's covering lemma, which makes it possible to build correlated source codes from channel codes via permutations. In Appendix I we show that the problem of finding the best error exponents for coding sources with full side information at the decoder, which has received attention in the recent literature, can easily be reduced to the familiar one for the discrete memoryless channel (DMC). Finally, in Appendices II and III we give rather precise double exponentially small bounds on the probabilities that a randomly chosen code will fail to meet the random coding or expurgated bound for the DMC. According to these results, good codes are hard to miss if selected at random. This also explains why good codes of a low complexity (such as those produced by a few permutations) exist.
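The construction sketched in the abstract — generating an entire codebook by applying a small number of coordinate permutations to one base codeword — can be illustrated with a toy snippet. This is a hedged, illustrative sketch only, not the paper's actual construction or analysis; the function name `permutation_code` and all parameters are invented for illustration:

```python
import random

def permutation_code(base_word, num_permutations, seed=0):
    """Toy sketch: build a codebook by applying randomly drawn
    coordinate permutations to a single base codeword.
    (Illustrative only; not the paper's specific scheme.)"""
    rng = random.Random(seed)
    n = len(base_word)
    codebook = []
    for _ in range(num_permutations):
        perm = list(range(n))
        rng.shuffle(perm)  # random coordinate permutation
        codebook.append(tuple(base_word[perm[i]] for i in range(n)))
    return codebook

# Permutations preserve the base word's composition (type), so every
# codeword shares the same empirical symbol distribution.
base = [0, 0, 0, 1, 1, 0, 1, 0]
code = permutation_code(base, num_permutations=2 * len(base))  # O(n) permutations
```

Note that distinct permutations can yield colliding codewords in this naive sketch; the paper's point is that, even so, linearly many permutations suffice to reach the random coding bound.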