Superscalar Huffman decoder hardware design
- 1 May 1994
- Proceedings article
- Published by SPIE-Intl Soc Optical Eng
Abstract
Huffman coding is one of the most common forms of lossless data compression. Many lossy image compression standards, for example MPEG and JPEG, use Huffman coding as the back-end entropy compressor because of its relatively good compression performance and simple hardware implementation. However, decoding speed is limited by a feedback loop: the length of the current codeword must be known before the next codeword can be located in the bit stream. For applications that require high-speed decoding, such as High Definition Television at about 100 Mbyte/s, this feedback loop can be prohibitively slow. The fastest conventional 'parallel' Huffman decoders decode one complete codeword per look-up table memory cycle. This paper describes three different hardware designs that break through this limit. All three depend on probabilistic modeling of the coded data stream to predict, or speculate on, the values of adjacent codewords. One design uses a single fully specified memory wide enough for two or more output tokens. The other two designs use multiple memories, each fed by a different portion of the code stream. In a simulation with JPEG Huffman token data, this superscalar approach achieves average decode rates twice or more that of a conventional 'parallel' decoder. The relative performance versus hardware cost is described for each design.
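To make the feedback loop and the wide-memory idea concrete, here is a minimal software sketch, not the paper's hardware. The toy code (A=0, B=10, C=11), all identifiers, and the bit layout are invented for illustration; `decode_serial` shows why one table access per token cannot be overlapped, and `decode_two_wide` shows a lookup memory wide enough for two output tokens, in the spirit of the first design described above.

```c
/* Minimal sketch (not the paper's circuits) of (a) the serial feedback
 * loop in conventional table-based Huffman decoding and (b) a lookup
 * memory wide enough for two decoded tokens per access. */
#include <stdint.h>
#include <stdio.h>

#define MAXLEN 2                       /* longest codeword in the toy code */

typedef struct { char sym; uint8_t len; } Entry;

/* Fully specified table indexed by the next MAXLEN bits of the stream. */
static const Entry TABLE[1 << MAXLEN] = {
    {'A', 1}, {'A', 1},                /* 00, 01 -> A (second bit ignored) */
    {'B', 2},                          /* 10                               */
    {'C', 2},                          /* 11                               */
};

/* Peek n bits (MSB first) starting at bit offset pos. */
static unsigned peek(const uint8_t *s, unsigned pos, unsigned n) {
    unsigned v = 0;
    for (unsigned i = 0; i < n; i++, pos++)
        v = (v << 1) | ((s[pos >> 3] >> (7 - (pos & 7))) & 1u);
    return v;
}

/* Conventional decoder: one token per table access.  The address of the
 * next lookup depends on the length of the current codeword, so the
 * lookups cannot overlap -- this is the feedback loop the paper attacks. */
static unsigned decode_serial(const uint8_t *s, unsigned nbits, char *out) {
    unsigned pos = 0, n = 0;
    while (pos + MAXLEN <= nbits) {    /* end-of-stream padding ignored    */
        Entry e = TABLE[peek(s, pos, MAXLEN)];
        out[n++] = e.sym;
        pos += e.len;                  /* feedback: len gates next address */
    }
    return n;
}

/* Wider memory holding two decoded tokens and their combined length, so
 * one memory cycle emits two symbols.  At this toy scale, 2*MAXLEN
 * address bits always cover two whole codewords; the paper's designs
 * instead speculate on likely codeword values when they do not. */
typedef struct { char sym0, sym1; uint8_t len; } Entry2;
static Entry2 TABLE2[1 << (2 * MAXLEN)];

static void build_table2(void) {
    for (unsigned i = 0; i < (1u << (2 * MAXLEN)); i++) {
        Entry a = TABLE[i >> MAXLEN];  /* first codeword: top MAXLEN bits  */
        Entry b = TABLE[(i >> (MAXLEN - a.len)) & ((1u << MAXLEN) - 1)];
        TABLE2[i] = (Entry2){ a.sym, b.sym, (uint8_t)(a.len + b.len) };
    }
}

static unsigned decode_two_wide(const uint8_t *s, unsigned nbits, char *out) {
    unsigned pos = 0, n = 0;
    while (pos + 2 * MAXLEN <= nbits) {
        Entry2 e = TABLE2[peek(s, pos, 2 * MAXLEN)];
        out[n++] = e.sym0;
        out[n++] = e.sym1;
        pos += e.len;                  /* one feedback per TWO tokens      */
    }
    return n;
}

int main(void) {
    const uint8_t stream[] = {0x5A, 0x5A};      /* 0101 1010 0101 1010 */
    char out[32];
    build_table2();
    unsigned n = decode_serial(stream, 16, out);
    printf("serial:   %.*s\n", (int)n, out);    /* ABCABABCAB          */
    n = decode_two_wide(stream, 16, out);
    printf("two-wide: %.*s\n", (int)n, out);    /* same prefix; the    */
    return 0;                                   /* short tail is left  */
}                                               /* undecoded here      */
```

Note that in this fully specified form the wide table needs 2*MAXLEN address bits, which grows exponentially with codeword length; the speculative designs in the paper avoid that growth by predicting likely adjacent codewords rather than enumerating every pair.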