An application of informational divergence to Huffman codes
- 1 January 1982
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 28 (1) , 36-43
- https://doi.org/10.1109/tit.1982.1056452
Abstract
A classification of all probability distributions over the finite alphabet of an information source is given, where the classes are the sets of distributions sharing the same binary Huffman code. Such a classification can be used in noiseless coding when the distribution of the finite memoryless source varies in time or becomes known only gradually: instead of applying the Huffman algorithm to each new estimate of the probability distribution, a simple test based on the above classification is run, and if the test is passed, the Huffman code used previously remains optimal for the new distribution.
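The abstract does not reproduce the paper's divergence-based test itself, but the phenomenon it exploits is that a single Huffman code is optimal for a whole set of distributions, not just the one it was built from. The sketch below illustrates this by direct comparison: it builds a binary Huffman code for one distribution, then checks whether that code's expected length under a perturbed distribution still matches a freshly built Huffman code's. The function names and the brute-force check are illustrative only (the paper's point is precisely to avoid the rebuild).

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for `probs`."""
    tie = count()  # tiebreaker so heapq never compares the symbol lists
    heap = [(p, next(tie), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:  # each merge adds one bit to every merged symbol
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tie), s1 + s2))
    return lengths

def expected_length(lengths, probs):
    return sum(l * p for l, p in zip(lengths, probs))

def still_optimal(old_lengths, new_probs, tol=1e-12):
    """Brute-force optimality check: the old code stays optimal for
    `new_probs` iff its expected length equals that of a Huffman code
    rebuilt from scratch (the old code can never do better)."""
    fresh = huffman_lengths(new_probs)
    return expected_length(old_lengths, new_probs) <= expected_length(fresh, new_probs) + tol

p0 = [0.4, 0.3, 0.2, 0.1]
code = huffman_lengths(p0)
print(code)                                             # [1, 2, 3, 3]
print(still_optimal(code, [0.38, 0.32, 0.18, 0.12]))    # small perturbation: True
print(still_optimal(code, [0.25, 0.25, 0.25, 0.25]))    # uniform source: False
```

The small perturbation leaves the code optimal because the new distribution still lies in the same class, while the uniform distribution does not (its optimal code assigns every symbol two bits). The paper's contribution is a test that decides this membership without rebuilding the code.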
This publication has 5 references indexed in Scilit:
- Variations on a theme by Huffman. IEEE Transactions on Information Theory, 1978
- An informational divergence geometry for stochastic matrices. Calcolo, 1978
- Huffman codes and self-information. IEEE Transactions on Information Theory, 1976
- Codes based on inaccurate source probabilities. IEEE Transactions on Information Theory, 1971
- A Method for the Construction of Minimum-Redundancy Codes. Proceedings of the IRE, 1952