Globally distributed content delivery
- 7 November 2002
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Internet Computing
- Vol. 6 (5), 50-58
- https://doi.org/10.1109/mic.2002.1036038
Abstract
When we launched the Akamai system in early 1999, it initially delivered only Web objects (images and documents). It has since evolved to distribute dynamically generated pages and even applications to the network's edge, providing customers with on-demand bandwidth and computing capacity. This reduces content providers' infrastructure requirements, and lets them deploy or expand services more quickly and easily. Our current system has more than 12,000 servers in over 1,000 networks. Operating servers in many locations poses many technical challenges, including how to direct user requests to appropriate servers, how to handle failures, how to monitor and control the servers, and how to update software across the system. We describe our system and how we've managed these challenges.
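The first challenge the abstract names, directing user requests to appropriate servers, is the kind of mapping problem consistent hashing addresses. The sketch below is illustrative only, not Akamai's actual request-mapping system: it builds a small hash ring, assigns object URLs to edge servers, and shows that when one server fails only the objects it owned are remapped. The server names, virtual-node count, and hash choice are assumptions made for the example.

```python
# Minimal consistent-hashing sketch (illustrative only; not Akamai's
# production mapping system). It assigns object URLs to edge servers so
# that a server failure remaps only the objects that server owned.
import bisect
import hashlib


def _hash(key: str) -> int:
    """Stable hash of a string onto a large integer ring."""
    return int(hashlib.md5(key.encode()).hexdigest(), 16)


class HashRing:
    def __init__(self, servers, vnodes: int = 100):
        # vnodes: virtual nodes per server; smooths load (assumed value).
        self._ring = []  # sorted list of (point, server) pairs
        for s in servers:
            for i in range(vnodes):
                self._ring.append((_hash(f"{s}#{i}"), s))
        self._ring.sort()

    def lookup(self, obj: str) -> str:
        """Return the server responsible for an object URL."""
        points = [p for p, _ in self._ring]
        idx = bisect.bisect(points, _hash(obj)) % len(self._ring)
        return self._ring[idx][1]

    def remove(self, server: str) -> None:
        """Simulate a server failure; only its objects move elsewhere."""
        self._ring = [(p, s) for p, s in self._ring if s != server]


if __name__ == "__main__":
    ring = HashRing(["edge-a", "edge-b", "edge-c"])
    objs = [f"/images/logo-{i}.gif" for i in range(6)]
    before = {o: ring.lookup(o) for o in objs}
    ring.remove("edge-b")  # fail one server
    after = {o: ring.lookup(o) for o in objs}
    moved = [o for o in objs if before[o] != after[o]]
    print("remapped after failure:", moved)  # only objects that were on edge-b
```

In this scheme the failure of a server leaves the assignment of every other server's objects untouched, which is why consistent hashing is a natural fit for a system that must tolerate frequent server and network failures at the edge.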