Coding of sources with unknown statistics--I: Probability of encoding error
- 1 May 1972
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 18 (3), 384-389
- https://doi.org/10.1109/tit.1972.1054830
Abstract
It is well known that it is often possible to obtain considerable data compression by encoding messages in long blocks. Usually the coding scheme for a specific source depends parametrically on the statistics of the source. Universal codes which are independent of the source statistics are introduced. These codes are shown to be asymptotically optimal in the sense that the probability of encoding error can be made vanishingly small for output rates no larger than those of optimal codes that do in fact depend on the statistics of the source. A particular universal coding scheme is introduced for which the encoding complexity increases no faster than the second power of the block length n and for which the encoding error vanishes exponentially with n. The discussion is limited to discrete-time finite-alphabet sources.
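To illustrate the idea of a code that needs no knowledge of the source statistics, here is a minimal sketch of one standard universal scheme for binary memoryless sources: a two-part (enumerative) code that first describes the number of ones in the block and then the block's index within its type class. This is a generic textbook construction under assumed binary i.i.d. sources, not necessarily the specific scheme of the paper; the function name and parameters are illustrative only.

```python
from math import comb, log2
import random

def universal_code_length(block):
    """Two-part (enumerative) code length in bits for a binary block:
    log2(n+1) bits to describe the number of ones k,
    plus log2(C(n, k)) bits for the block's index within its type class.
    The encoder never uses the source probability p."""
    n = len(block)
    k = sum(block)
    return log2(n + 1) + log2(comb(n, k))

# Example: the per-symbol rate approaches the empirical binary entropy
# H(k/n) as the block length n grows, i.e. the rate an optimal code
# tuned to the true p would achieve.
random.seed(0)
p = 0.1          # true source parameter, unknown to the encoder
n = 10_000       # block length
block = [1 if random.random() < p else 0 for _ in range(n)]
rate = universal_code_length(block) / n
print(f"universal rate ~ {rate:.4f} bits/symbol")  # close to H(0.1) ~ 0.469
```

The overhead of the first part, log2(n+1)/n bits per symbol, vanishes as n grows, which is the sense in which such a code can be asymptotically optimal without depending on the source statistics.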