Buffering and caching in large-scale video servers
- 19 November 2002
- proceedings article
- Published by Institute of Electrical and Electronics Engineers (IEEE)
Abstract
Video-on-demand servers are characterized by stringent real-time constraints, as each stream requires isochronous data playout. The capacity of the system depends on the acceptable jitter per stream, i.e., the number of data blocks that do not meet their real-time constraints. Per-stream read-ahead buffering avoids disruptions in playback caused by variations in disk access time and queuing delays. With heavily skewed access patterns to the stored video data, the system is often disk-arm-bound. In such cases, serving video streams from a memory cache can substantially reduce server cost. In this paper, we study the cost-performance trade-offs of various buffering and caching strategies that can be used in a large-scale video server. We first study the cost impact of varying the buffer size, disk utilization, and disk characteristics on the overall capacity of the system. Subsequently, we study the cost-effectiveness of a technique for memory caching across streams that exploits temporal locality and workload fluctuations.
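The role of per-stream read-ahead buffering described above can be illustrated with a toy simulation (not from the paper; all parameters and the function name are hypothetical): a stream consumes one block per playout period, disk reads complete after a variable delay, and playout begins only once the read-ahead buffer is full. A block contributes to jitter if the disk has not delivered it by its playout deadline.

```python
import random

def simulate_stream(num_blocks=1000, buffer_depth=4, period_ms=33.0,
                    mean_read_ms=30.0, jitter_ms=30.0, seed=0):
    """Toy model: fraction of blocks that miss their playout deadline.

    Blocks are read sequentially; each read takes a normally distributed
    amount of time. Playout of block i is due `i * period_ms` after the
    moment the read-ahead buffer first holds `buffer_depth` blocks.
    Parameter values are illustrative assumptions, not from the paper.
    """
    rng = random.Random(seed)
    ready, t = [], 0.0
    for _ in range(num_blocks):
        # Variable disk service time (access time + queuing delay).
        t += max(1.0, rng.gauss(mean_read_ms, jitter_ms / 3))
        ready.append(t)
    # Playout starts once `buffer_depth` blocks have been prefetched.
    start = ready[min(buffer_depth, num_blocks) - 1]
    misses = sum(1 for i, r in enumerate(ready)
                 if r > start + i * period_ms)
    return misses / num_blocks
```

Because a deeper read-ahead buffer only delays the start of playout, every deadline moves later, so the miss fraction is non-increasing in `buffer_depth`; the memory spent on the buffer buys tolerance to disk-latency variation, which is exactly the cost-performance trade-off the paper examines.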