Abstract
Digital video is often valued for its data compression and robustness against noise, but it is less often considered that a fairly complex computational device mediates between the stored or transmitted bit stream and the display. The author explores some of the possibilities inherent in the source-to-display decoupling that a digital video system makes possible, and examines the implications for both the digital video representation and the decoding device. The essential idea is to use the computation involved in both creating and decoding the bit stream to decouple the origination of the imagery from its ultimate viewing. A general-purpose CPU was combined with several specialized coprocessors and a full crosspoint switch, allowing both pipelining and parallel processing to take place between the communications channel and the display. This architecture has been realized in the Cheops Imaging System, described by the author and J.A. Watlington (Proc. SPIE-Int. Soc. Opt. Eng., vol. 1605, pp. 886-93, 1991), which comprises a hardware and software architecture for processing image sequence data and structured scene representations in real time.
