Fixed-rate universal lossy source coding and rates of convergence for memoryless sources

Abstract
A fixed-rate universal lossy coding scheme is introduced for independent and identically distributed (i.i.d.) sources. It is shown for finite-alphabet sources and arbitrary single-letter distortion measures that, as the sample size n grows, the expected distortion obtained using this universal scheme converges to Shannon's distortion-rate function D(R) at a rate O(log n/n). The scheme can be extended to universal quantization of real i.i.d. sources subject to a squared-error criterion. It is shown in this case that the per-letter distortion converges to D(R) at a rate O(√(log n/n)), both in expectation and almost surely, for any bounded real-valued i.i.d. source.
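The convergence phenomenon described above can be illustrated numerically. The sketch below is not the paper's universal scheme; it is a minimal stand-in that trains a fixed-rate scalar quantizer with Lloyd's algorithm on n training samples of a unit-variance Gaussian source and measures its test distortion. The Shannon distortion-rate function of this source is D(R) = 2^(-2R), a lower bound that any rate-R quantizer's distortion must exceed; the training-set sizes and rate R = 2 are arbitrary choices for the demonstration.

```python
import numpy as np

def lloyd_codebook(train, k, iters=50):
    """Train a k-level scalar codebook with Lloyd's algorithm:
    alternate nearest-neighbor assignment and centroid update."""
    rng = np.random.default_rng(0)
    code = np.sort(rng.choice(train, size=k, replace=False))
    for _ in range(iters):
        idx = np.argmin(np.abs(train[:, None] - code[None, :]), axis=1)
        for j in range(k):
            cell = train[idx == j]
            if cell.size:                    # skip empty cells
                code[j] = cell.mean()
    return code

def distortion(data, code):
    """Mean squared error of nearest-codeword quantization."""
    return np.min((data[:, None] - code[None, :]) ** 2, axis=1).mean()

rng = np.random.default_rng(1)
R = 2                                        # rate in bits per source letter
d_R = 2.0 ** (-2 * R)                        # Shannon DRF of N(0,1): 2^(-2R)
test = rng.standard_normal(200_000)          # held-out source realization

results = {}
for n in (50, 50_000):                       # small vs. large training sample
    code = lloyd_codebook(rng.standard_normal(n), 2 ** R)
    results[n] = distortion(test, code)
```

With a large training sample the empirical distortion approaches the optimal 4-level scalar quantizer's MSE (about 0.118 for the Gaussian), which still exceeds D(2) = 0.0625 because a scalar quantizer cannot attain the distortion-rate function; the paper's scheme closes that gap by coding blocks of growing length.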
