Entropy and data compression schemes
- 1 January 1993
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 39 (1), 78-83
- https://doi.org/10.1109/18.179344
Abstract
Some new ways of defining the entropy of a process by observing a single typical output sequence, as well as a new kind of Shannon-McMillan-Breiman theorem, are presented. This provides new and conceptually very simple ways of estimating the entropy of an ergodic stationary source, as well as new insight into the workings of such well-known data compression schemes as the Lempel-Ziv algorithm.
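As an illustration of how a Lempel-Ziv-style parse can estimate entropy from a single output sequence, the sketch below uses the classical LZ78 incremental parse and the textbook estimate c·log2(c)/n, where c is the number of distinct phrases. This is a standard construction from the Ziv-Lempel (1978) line of work, offered here only as a hedged example; it is not the specific estimators or theorems introduced in this paper.

```python
import math
import random

def lz78_phrase_count(seq):
    """Number of phrases in the LZ78 incremental parse of seq."""
    dictionary = set()
    phrase = ""
    count = 0
    for symbol in seq:
        phrase += symbol
        if phrase not in dictionary:
            # Each phrase is the shortest prefix not yet in the dictionary.
            dictionary.add(phrase)
            count += 1
            phrase = ""
    if phrase:
        # Count a possibly repeated trailing phrase.
        count += 1
    return count

def lz_entropy_estimate(seq):
    """Estimate the entropy rate (bits/symbol) as c*log2(c)/n."""
    n = len(seq)
    c = lz78_phrase_count(seq)
    return c * math.log2(c) / n

random.seed(0)
coin = "".join(random.choice("01") for _ in range(20000))
print(lz_entropy_estimate(coin))        # close to 1 bit/symbol for fair coin flips
print(lz_entropy_estimate("0" * 20000)) # much smaller for a highly compressible sequence
```

For a typical sequence of fair coin flips the estimate converges (slowly) to the true entropy of 1 bit per symbol, while for the constant sequence it is far below 1, reflecting the compressibility that a Lempel-Ziv coder exploits.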