Hierarchical universal coding
- 1 January 1996
- journal article
- research article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 42 (5), 1354–1364
- https://doi.org/10.1109/18.532877
Abstract
In an earlier paper, we proved a strong version of the redundancy-capacity converse theorem of universal coding, stating that for "most" sources in a given class, the universal coding redundancy is essentially lower-bounded by the capacity of the channel induced by this class. Since this result holds for general classes of sources, it extends Rissanen's strong converse theorem for parametric families. While our earlier result established strong optimality only for mixture codes weighted by the capacity-achieving prior, our first result herein extends this finding to a general prior. For some cases our technique also leads to a simplified proof of the above-mentioned strong converse theorem. The main interest in this paper, however, is in extending the theory of universal coding to hierarchical structures of classes, where each class may have a different capacity. In this setting, one wishes to incur redundancy essentially as small as that corresponding to the active class, rather than to the union of classes. Our main result is that the redundancy of a code based on a two-stage mixture (first within each class, and then over the classes) is no worse than that of any other code for "most" sources of "most" classes. If, in addition, the classes can be efficiently distinguished by a certain decision rule, then the best attainable redundancy is given explicitly by the capacity of the active class plus the normalized negative logarithm of the prior probability assigned to this class. These results suggest some interesting guidelines for the choice of the prior. We also discuss some examples with a natural hierarchical partition into classes.
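To make the two-stage construction concrete, here is a minimal sketch under assumed notation (the symbols below are ours, not taken from the paper): the code is built on the probability obtained by first mixing within each class and then mixing over the classes, and for the active class the attainable redundancy described above takes roughly the form

```latex
% Sketch under assumed notation: \Lambda_j is the j-th class of sources,
% w_j a within-class prior, \pi a prior over classes, C_j the class capacity.
P(x^n) = \sum_j \pi(j) \int_{\Lambda_j} P_\theta(x^n)\, w_j(d\theta),
\qquad
R_n(j) \approx C_j + \frac{1}{n}\log\frac{1}{\pi(j)} .
```

A toy numerical illustration of the class-prior penalty, using two hypothetical classes of Bernoulli sources with uniform within-class weights (all names, parameters, and the discretization are illustrative assumptions, not the paper's construction):

```python
import math

def mixture_codelength(x, thetas, weights):
    # -log2 of a finite mixture sum_theta w(theta) * P_theta(x)
    # for a binary sequence x under Bernoulli(theta) sources.
    k = sum(x)   # number of ones
    n = len(x)
    p = sum(w * (t ** k) * ((1 - t) ** (n - k))
            for t, w in zip(thetas, weights))
    return -math.log2(p)

def two_stage_codelength(x, classes, prior):
    # Two-stage mixture: first within each class, then over classes,
    # i.e. P(x) = sum_j pi(j) * sum_theta w_j(theta) * P_theta(x).
    p = sum(pj * 2 ** (-mixture_codelength(x, thetas, weights))
            for pj, (thetas, weights) in zip(prior, classes))
    return -math.log2(p)

# Two toy classes: "low-bias" and "high-bias" Bernoulli parameters,
# each with a uniform within-class weighting.
low  = ([0.1, 0.2, 0.3], [1 / 3] * 3)
high = ([0.7, 0.8, 0.9], [1 / 3] * 3)
x = [1, 1, 0, 1, 1, 1, 0, 1]   # a sample favoring the high-bias class

l_two_stage = two_stage_codelength(x, [low, high], prior=[0.5, 0.5])
l_active    = mixture_codelength(x, *high)

# The two-stage code pays at most -log2 pi(j) = 1 bit over the
# within-class mixture of the active class.
print(f"two-stage: {l_two_stage:.3f} bits, active class: {l_active:.3f} bits")
```

On this sample the two-stage codelength exceeds the active-class mixture codelength by just under one bit, matching the -log2 of the class prior 0.5, which is the penalty term in the redundancy expression above.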
References
- A strong version of the redundancy-capacity theorem of universal coding. IEEE Transactions on Information Theory, 1995.
- Jeffreys' prior is asymptotically least favorable under entropy risk. Journal of Statistical Planning and Inference, 1994.
- Approximation of Density Functions by Sequences of Exponential Families. The Annals of Statistics, 1991.
- On the competitive optimality of Huffman codes. IEEE Transactions on Information Theory, 1991.
- Information-theoretic asymptotics of Bayes methods. IEEE Transactions on Information Theory, 1990.
- Stochastic Complexity and Modeling. The Annals of Statistics, 1986.
- Universal coding, information, prediction, and estimation. IEEE Transactions on Information Theory, 1984.
- A universal data compression system. IEEE Transactions on Information Theory, 1983.
- Minimax noiseless universal coding for Markov sources. IEEE Transactions on Information Theory, 1983.
- A source matching approach to finding minimax codes. IEEE Transactions on Information Theory, 1980.