The determination of efficient record segmentations and blocking factors for shared data files
- 1 September 1977
- journal article
- Published by Association for Computing Machinery (ACM) in ACM Transactions on Database Systems
- Vol. 2 (3), pp. 279-296
- https://doi.org/10.1145/320557.320574
Abstract
It is generally believed that 80 percent of all retrieval from a commercial database is directed at only 20 percent of the stored data items. By partitioning data items into primary and secondary record segments, storing them in physically separate files, and judiciously allocating available buffer space to the two files, it is possible to significantly reduce the average cost of information retrieval from a shared database. An analytic model, based upon knowledge of data item lengths, data access costs, and user retrieval patterns, is developed to assist an analyst with this assignment problem. A computationally tractable design algorithm is presented and results of its application are described.
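To make the segmentation idea concrete, the sketch below illustrates the general flavor of the problem in Python: given per-item lengths and access probabilities, it estimates an expected retrieval cost for a primary/secondary split and builds a candidate primary segment greedily. This is a minimal illustration only; the cost model, the greedy heuristic, the `primary_budget` parameter, and all example figures are assumptions for exposition and are not the model or algorithm developed in the paper.

```python
from dataclasses import dataclass


@dataclass
class DataItem:
    name: str
    length: int          # bytes the item occupies in the stored record
    access_prob: float   # fraction of retrievals that reference this item


def expected_transfer_cost(items, primary,
                           cost_per_byte_primary=1.0,
                           cost_per_byte_secondary=4.0):
    """Expected cost per retrieval under a hypothetical model:
    every retrieval reads the whole primary segment, and the secondary
    segment is fetched only when a requested item lives there."""
    primary_len = sum(i.length for i in items if i.name in primary)
    secondary_len = sum(i.length for i in items if i.name not in primary)
    p_secondary = min(1.0, sum(i.access_prob
                               for i in items if i.name not in primary))
    return (primary_len * cost_per_byte_primary
            + p_secondary * secondary_len * cost_per_byte_secondary)


def greedy_segmentation(items, primary_budget):
    """Fill the primary segment with the most frequently accessed bytes,
    ranking items by access probability per byte until the budget is spent."""
    primary, used = set(), 0
    ranked = sorted(items, key=lambda i: i.access_prob / i.length, reverse=True)
    for item in ranked:
        if used + item.length <= primary_budget:
            primary.add(item.name)
            used += item.length
    return primary


if __name__ == "__main__":
    # Illustrative data items; lengths and probabilities are invented.
    items = [
        DataItem("cust_id", 8, 0.95),
        DataItem("balance", 8, 0.80),
        DataItem("name", 40, 0.30),
        DataItem("address", 120, 0.05),
        DataItem("history", 400, 0.02),
    ]
    primary = greedy_segmentation(items, primary_budget=64)
    print("primary segment:", sorted(primary))
    print("expected cost:", expected_transfer_cost(items, primary))
```

With the invented figures above, the short, frequently accessed items end up in the primary segment while the long, rarely touched items are relegated to the secondary file, which is the intuition behind the 80/20 observation in the abstract.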