Huffman codes and self-information

Abstract
In this paper the connection between the self-information of a source letter from a finite alphabet and its codeword length in a Huffman code is investigated. Consider the set of all independent finite-alphabet sources that contain a source letter a of probability p. The maximum, over this set, of the length of a Huffman codeword for a is determined. This maximum remains constant as p varies between the reciprocals of two consecutive Fibonacci numbers. For small p the maximum is approximately equal to $\left[\log_{2}\frac{1+\sqrt{5}}{2}\right]^{-1} \approx 1.44$ times the self-information.
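The extremal behaviour described above can be illustrated with a source whose letter probabilities are proportional to consecutive Fibonacci numbers, a standard example in which the Huffman construction produces a maximally skewed (chain-shaped) tree. The Python sketch below is illustrative only: the helper huffman_lengths and the choice n = 16 are assumptions for the demonstration, not taken from the paper. It builds the Huffman code and compares the longest codeword length with the self-information of the least probable letter.

```python
import heapq
import math
from itertools import count

def huffman_lengths(probs):
    """Return Huffman codeword lengths for a list of probabilities.

    Standard heap-based construction: repeatedly merge the two least
    probable nodes; each merge adds one bit to the codeword of every
    leaf beneath the merged node.
    """
    tiebreak = count()
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, leaves1 = heapq.heappop(heap)
        p2, _, leaves2 = heapq.heappop(heap)
        merged = leaves1 + leaves2
        for leaf in merged:
            lengths[leaf] += 1
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return lengths

# Probabilities proportional to the Fibonacci numbers 1, 1, 2, 3, 5, ...:
# the Huffman algorithm then produces a completely unbalanced (chain) tree,
# so the least probable letter receives the longest possible codeword.
n = 16
fib = [1, 1]
while len(fib) < n:
    fib.append(fib[-1] + fib[-2])
total = sum(fib)
probs = [f / total for f in fib]

lengths = huffman_lengths(probs)
p = probs[0]                  # a least probable letter
self_info = -math.log2(p)     # its self-information in bits
print(f"p = {p:.6f}  length = {lengths[0]}  "
      f"self-information = {self_info:.2f}  ratio = {lengths[0] / self_info:.3f}")
# The ratio grows toward 1 / log2((1 + sqrt(5)) / 2) ~ 1.44 as n increases.
```

Running the sketch with larger n shows the ratio of codeword length to self-information creeping toward the constant $\left[\log_{2}\frac{1+\sqrt{5}}{2}\right]^{-1}$ stated in the abstract.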
