INFORMATION CONTENT IN UNIFORMLY DISCRETIZED GAUSSIAN NOISE: OPTIMAL COMPRESSION RATES
- 1 June 1999
- journal article
- Published by World Scientific Pub Co Pte Ltd in International Journal of Modern Physics C
- Vol. 10 (4), 687-716
- https://doi.org/10.1142/s0129183199000528
Abstract
We approach the theoretical problem of compressing a signal dominated by Gaussian noise. We present expressions for the compression ratio that can be reached, in the light of Shannon's noiseless coding theorem, for a linearly quantized stochastic Gaussian signal (noise). The compression ratio decreases logarithmically with the amplitude of the frequency spectrum P(f) of the noise. Entropy values and compression rates are shown to depend on the shape of this power spectrum, given different normalizations. The cases of white noise (w.n.), 1/f^p power-law noise (including 1/f noise), (w.n. + 1/f) noise, and piecewise (w.n. + 1/f | w.n. + 1/f^2) noise are discussed, and quantitative behaviors and useful approximations are provided.
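As a minimal illustration of the quantity the abstract discusses (not taken from the paper), the sketch below compares the empirical Shannon entropy of linearly quantized Gaussian white noise with the standard high-resolution approximation H ≈ (1/2) log2(2 pi e sigma^2) - log2(delta) bits per sample, and derives an ideal compression ratio against a fixed-width encoding. The function name `quantized_entropy` and the parameter values (sigma, delta, 16-bit samples) are assumptions chosen for the example, not the paper's notation.

```python
# Sketch: entropy of uniformly quantized Gaussian white noise vs. the
# high-resolution approximation, and the implied ideal compression ratio.
import numpy as np

def quantized_entropy(samples: np.ndarray, delta: float) -> float:
    """Empirical Shannon entropy (bits/sample) of samples quantized with step delta."""
    q = np.round(samples / delta).astype(np.int64)   # linear (uniform) quantization
    _, counts = np.unique(q, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
sigma, delta, n_bits = 1.0, 0.05, 16                 # assumed example parameters
x = rng.normal(0.0, sigma, size=1_000_000)           # Gaussian white noise

h_emp = quantized_entropy(x, delta)
h_th = 0.5 * np.log2(2 * np.pi * np.e * sigma**2) - np.log2(delta)

print(f"empirical entropy  : {h_emp:.3f} bits/sample")
print(f"theoretical entropy: {h_th:.3f} bits/sample")
print(f"ideal compression ratio vs {n_bits}-bit samples: {n_bits / h_emp:.2f}")
```

For white noise the two entropy values agree closely once delta is small compared with sigma; for colored (e.g. 1/f^p) noise the per-sample entropy additionally depends on the shape of the power spectrum, which is the dependence the paper quantifies.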
References
- 10-GHz Tenerife cosmic microwave background observations at 8° resolution and their analysis using a new maximum entropy method. Monthly Notices of the Royal Astronomical Society, 1998
- Testing for nonlinearity using redundancies: quantitative and qualitative aspects. Physica D: Nonlinear Phenomena, 1995
- Testing for nonlinearity in time series: the method of surrogate data. Physica D: Nonlinear Phenomena, 1992
- A VLSI chip set for high-speed lossless data compression. IEEE Transactions on Circuits and Systems for Video Technology, 1992
- International digital facsimile coding standards. Proceedings of the IEEE, 1980
- Variations on a theme by Huffman. IEEE Transactions on Information Theory, 1978
- A Mathematical Theory of Communication. Bell System Technical Journal, 1948