Information content in Gaussian noise: optimal compression rates
Abstract
We address the theoretical problem of compressing a signal dominated by Gaussian noise. We present accurate expressions for the compression ratio that can be reached, in light of Shannon's noiseless coding theorem, for a linearly quantized stochastic Gaussian signal (noise). The compression ratio decreases logarithmically with the amplitude of the frequency spectrum of the noise $P(f)$. Further, we show how the entropy and the compression rate depend on the shape of this power spectrum under different normalizations. The cases of white noise ($w.n.$), power-law noise $f^{n_p}$ (including $1/f$ noise), ($w.n. + 1/f$) noise, and piecewise ($w.n. + 1/f + 1/f^2$) noise are discussed in detail, and quantitative behaviours and useful approximations are provided.
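The core quantities in the abstract can be illustrated numerically. The sketch below (an assumption for illustration, not the paper's own derivation) estimates the Shannon entropy of a linearly quantized Gaussian noise sample and compares it with the standard high-resolution approximation $H \approx \tfrac{1}{2}\log_2(2\pi e\,\sigma^2) - \log_2\Delta$; the values of $\sigma$, the quantization step $\Delta$, and the raw word length $b$ are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0          # noise standard deviation (assumed value)
delta = sigma / 8.0  # linear quantization step (assumed value)
b = 16               # raw word length in bits (assumed value)

# Draw a large Gaussian sample and quantize it linearly with step delta.
x = rng.normal(0.0, sigma, 1_000_000)
q = np.round(x / delta).astype(int)

# Empirical Shannon entropy of the quantized symbols, in bits per sample.
_, counts = np.unique(q, return_counts=True)
p = counts / counts.sum()
H_emp = -np.sum(p * np.log2(p))

# High-resolution approximation for a quantized Gaussian source.
H_th = 0.5 * np.log2(2 * np.pi * np.e * sigma**2) - np.log2(delta)

# Shannon's noiseless coding theorem: the best achievable lossless rate
# is H bits per sample, so b-bit raw words compress by a factor b / H.
ratio = b / H_emp
```

Since the entropy grows only logarithmically with $\sigma$ (and hence with the spectrum amplitude), doubling the noise amplitude costs exactly one extra bit per sample, which is the logarithmic dependence the abstract refers to.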