Quantizing for maximum output entropy (Corresp.)
- 1 September 1971
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 17 (5), 612
- https://doi.org/10.1109/tit.1971.1054681
Abstract
The entropy at the output of a quantizer is equal to the average mutual information between the unquantized and quantized random variables. Thus, for a fixed number of quantization levels, output entropy is a reasonable information-theoretic criterion of quantizer fidelity. It is shown that, for a class of signal distributions that includes the Gaussian, the quantizers with maximum output entropy (MOE) and minimum average error (MAE) are approximately the same within a multiplicative constant.
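The MOE idea can be illustrated numerically: for a fixed number of levels, output entropy is maximized when the quantizer cells are equiprobable, so placing cell boundaries at the quantiles of the input distribution drives the output entropy toward log2(N). A minimal sketch (not from the paper; the variable names and the ±3σ uniform comparator are illustrative choices) comparing a quantile-based quantizer against a uniform one on Gaussian data:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)  # Gaussian source samples
N = 8                             # number of quantization levels

def output_entropy(edges, samples):
    # Assign each sample to a quantizer cell via its interior cell edges,
    # estimate the cell probabilities empirically, and return entropy in bits.
    idx = np.searchsorted(edges, samples)
    p = np.bincount(idx, minlength=len(edges) + 1) / len(samples)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# MOE-style quantizer: equiprobable cells, so the N-1 interior edges
# sit at the empirical quantiles of the input.
moe_edges = np.quantile(x, np.arange(1, N) / N)

# Uniform quantizer over roughly +/-3 sigma for comparison
# (N-1 interior edges also give N cells, including the two tails).
uni_edges = np.linspace(-3.0, 3.0, N - 1)

H_moe = output_entropy(moe_edges, x)
H_uni = output_entropy(uni_edges, x)
```

Here `H_moe` lands essentially at log2(8) = 3 bits, while the uniform quantizer's unequal cell probabilities leave its output entropy noticeably lower.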
References
- On optimum quantization, IEEE Transactions on Information Theory, 1969
- Asymptotically efficient quantizing, IEEE Transactions on Information Theory, 1968
- Useful Approximations to Optimum Quantization, IEEE Transactions on Communications, 1966
- Quantizing for minimum distortion (Corresp.), IEEE Transactions on Information Theory, 1964
- Quantizing for minimum distortion, IEEE Transactions on Information Theory, 1960
- Instantaneous companding of quantized signals, Bell System Technical Journal, 1957