INFORMATION CONTENT IN UNIFORMLY DISCRETIZED GAUSSIAN NOISE: OPTIMAL COMPRESSION RATES

Abstract
We address the theoretical problem of compressing a signal dominated by Gaussian noise. In the light of Shannon's noiseless coding theorem, we present expressions for the compression ratio that can be reached for a linearly quantized stochastic Gaussian signal (noise). The compression ratio decreases logarithmically with the amplitude of the power spectrum P(f) of the noise. Entropy values and compression rates are shown to depend on the shape of this power spectrum, given different normalizations. The cases of white noise (w.n.), 1/f^p power-law noise (including 1/f noise), (w.n. + 1/f) noise, and piecewise (w.n. + 1/f | w.n. + 1/f^2) noise are discussed, and quantitative behaviors and useful approximations are provided.
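
The sketch below is an illustration, not the paper's own derivation: it estimates the discrete entropy (bits/sample) of a linearly quantized white Gaussian signal and the resulting ideal compression ratio implied by Shannon's noiseless coding theorem. The quantizer range (+/- 4 sigma), 12-bit word length, and the helper name quantized_gaussian_entropy are illustrative assumptions.

import numpy as np
from scipy.stats import norm

def quantized_gaussian_entropy(sigma, delta):
    # Discrete entropy (bits/sample) of N(0, sigma^2) uniformly quantized with
    # step delta, obtained by summing the probability mass in each bin.
    # Bins spanning +/- 10 sigma capture essentially all of the probability.
    edges = np.arange(-10.0 * sigma, 10.0 * sigma + delta, delta)
    p = np.diff(norm.cdf(edges, scale=sigma))
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

sigma = 1.0                      # noise standard deviation (assumed)
bits = 12                        # word length of the linear quantizer (assumed)
delta = 8.0 * sigma / 2**bits    # step size if the quantizer spans +/- 4 sigma

H = quantized_gaussian_entropy(sigma, delta)
# High-resolution approximation: H ~ 0.5*log2(2*pi*e*sigma^2) - log2(delta)
H_approx = 0.5 * np.log2(2 * np.pi * np.e * sigma**2) - np.log2(delta)
print(f"entropy ~ {H:.3f} bits/sample (approx. {H_approx:.3f})")
print(f"ideal compression ratio ~ {bits / H:.3f}")  # raw bits per sample / entropy

For white noise this ratio stays close to 1 unless the quantization is coarse relative to sigma; correlated (colored) noise lowers the entropy per sample further, which is where the dependence on the shape of P(f) discussed in the paper enters.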
