Stationary coding of processes
- 1 May 1979
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 25 (3), 283-291
- https://doi.org/10.1109/tit.1979.1056037
Abstract
In 1969, Ornstein showed that two independent, identically distributed (i.i.d.) processes with the same entropy are isomorphic, that is, there is a stationary, invertible code from one onto the other. These results have led to a number of characterizations of the class of processes which are stationary codings of i.i.d. processes. Many of these ideas are now finding use in information theory. Some of these results and ideas are discussed, using the process language familiar to information theorists.
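For readers less familiar with the ergodic-theory terminology, the following is a minimal sketch (not drawn from the article itself) of the standard notion of a stationary code and of the isomorphism statement quoted in the abstract; the symbols $\phi$, $f$, the alphabets $A$, $B$, and the window width $w$ are illustrative choices, not the paper's notation.

```latex
% Sketch (assumed notation): stationary codes and Ornstein's isomorphism theorem.
% Let (X_n) take values in a finite alphabet A, with shift T on A^Z, and let S
% be the shift on B^Z. A stationary code is a measurable map that commutes with
% the shifts:
\[
  \phi : A^{\mathbb{Z}} \to B^{\mathbb{Z}}, \qquad \phi(Tx) = S\,\phi(x).
\]
% A sliding-block (finite-window) code of width 2w+1 is the special case
\[
  Y_n = f\bigl(X_{n-w}, \dots, X_{n+w}\bigr), \qquad n \in \mathbb{Z};
\]
% a general stationary code may depend on the whole sequence, Y_n = f(T^n x).
% The isomorphism theorem cited in the abstract: if (X_n) and (Y_n) are i.i.d.
% processes with equal entropy, H(X_0) = H(Y_0), then there is an invertible
% stationary code \phi carrying the distribution of (X_n) onto that of (Y_n).
```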