Data compression
- 1 September 1987
- journal article
- Published by Association for Computing Machinery (ACM) in ACM Computing Surveys
- Vol. 19 (3), 261-296
- https://doi.org/10.1145/45072.45074
Abstract
This paper surveys a variety of data compression methods spanning almost 40 years of research, from the work of Shannon, Fano, and Huffman in the late 1940s to a technique developed in 1986. The aim of data compression is to reduce redundancy in stored or communicated data, thus increasing effective data density. Data compression has important application in the areas of file storage and distributed systems. Concepts from information theory, as they relate to the goals and evaluation of data compression methods, are discussed briefly. A framework for evaluation and comparison of methods is constructed and applied to the algorithms presented. Comparisons of both theoretical and empirical natures are reported, and possibilities for future research are suggested.
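For context, the sketch below illustrates the static Huffman coding idea that the survey covers: build a prefix code from symbol frequencies so that frequent symbols get short codewords. It is not taken from the paper; the function name huffman_codes and the example string are illustrative assumptions.

```python
# Minimal static Huffman coder (illustrative sketch, not the paper's algorithm).
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a binary prefix code from symbol frequencies in `text`."""
    freq = Counter(text)
    # Heap entries: (weight, tie-breaker, node); a node is a symbol or a pair of nodes.
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    if len(heap) == 1:
        # Degenerate case: a single distinct symbol gets the one-bit code "0".
        _, _, sym = heap[0]
        return {sym: "0"}
    while len(heap) > 1:
        # Repeatedly merge the two lightest subtrees.
        w1, _, n1 = heapq.heappop(heap)
        w2, _, n2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, counter, (n1, n2)))
        counter += 1
    codes = {}
    def assign(node, prefix):
        # Internal nodes are tuples; leaves are symbols.
        if isinstance(node, tuple):
            assign(node[0], prefix + "0")
            assign(node[1], prefix + "1")
        else:
            codes[node] = prefix
    assign(heap[0][2], "")
    return codes

if __name__ == "__main__":
    msg = "abracadabra"
    codes = huffman_codes(msg)
    encoded = "".join(codes[c] for c in msg)
    print(codes)
    print(f"{len(msg) * 8} bits raw vs {len(encoded)} bits encoded")
```

The gain over a fixed 8-bit encoding here comes from assigning the frequent symbol "a" a short codeword, which is the redundancy-reduction goal the abstract describes.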