Information rates of autoregressive processes
- 1 July 1970
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 16 (4), 412-421
- https://doi.org/10.1109/tit.1970.1054470
Abstract
The rate distortion function R(D) is calculated for two time-discrete autoregressive sources: the time-discrete Gaussian autoregressive source with a mean-square-error fidelity criterion, and the binary-symmetric first-order Markov source with an average probability-of-error per bit fidelity criterion. In both cases it is shown that R(D) is bounded below by the rate distortion function of the independent-letter identically distributed sequence that generates the autoregressive source. This lower bound is shown to hold with equality for a nonzero region of small average distortion. The positive coding theorem is proved for the possibly nonstationary Gaussian autoregressive source with a constraint on the parameters. Finally, it is shown that the rate distortion function of any time-discrete autoregressive source with a difference distortion measure can be bounded below by the rate distortion function of the independent-letter identically distributed generating sequence with the same distortion measure.
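As a reading aid, the lower bound described in the abstract can be written out explicitly. The sketch below uses standard rate distortion notation introduced here rather than taken from the paper: σ² denotes the variance of the i.i.d. Gaussian innovation sequence driving the autoregressive source, and p the transition probability of the binary-symmetric Markov source.

```latex
% Minimal sketch of the bounds summarized in the abstract (notation assumed,
% not quoted from the paper). For a Gaussian autoregressive source
% X_k = \sum_i a_i X_{k-i} + Z_k with i.i.d. N(0, \sigma^2) innovations Z_k
% and mean-square-error distortion D, the generating-sequence bound reads
\[
  R(D) \;\ge\; R_Z(D) \;=\; \tfrac{1}{2}\log\frac{\sigma^2}{D},
  \qquad 0 < D \le \sigma^2 ,
\]
% with equality for all sufficiently small D. For the binary-symmetric
% first-order Markov source with transition probability p and average
% probability-of-error distortion, the analogous bound is
\[
  R(D) \;\ge\; h(p) - h(D),
  \qquad h(x) = -x\log x - (1-x)\log(1-x).
\]
```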