The basic concept behind caching is to store content close to the requesting device. That content renders more quickly and efficiently because it traverses a shorter path. The request doesn’t have to go all the way back to the original content server. Along with speeding up the request, caching also reduces traffic to your origin server.

Standard Internet caching of objects is governed by two components: time-to-live and cache control headers.

Time-to-live (TTL)

TTL is the amount of time an edge server holds an object in its cache without checking your origin server to see if it has changed.

When a client requests content, edge servers compare the cached object's timestamp with the current time. If the object has been in cache longer than the TTL, the edge server requests an updated copy from the origin, caches it, and then sends it in the response to the client.
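The TTL check described above can be sketched as a small in-memory cache. This is a hypothetical illustration, not Akamai edge-server code; the class name, the 60-second TTL, and the origin callback are all invented for the example.

```python
import time

# Hypothetical sketch of the TTL freshness check an edge cache performs.
class TTLCache:
    def __init__(self, ttl_seconds, fetch_from_origin):
        self.ttl = ttl_seconds
        self.fetch = fetch_from_origin  # callable standing in for the origin server
        self.store = {}                 # key -> (object, timestamp when cached)

    def get(self, key):
        entry = self.store.get(key)
        now = time.time()
        if entry is not None:
            obj, cached_at = entry
            if now - cached_at < self.ttl:
                return obj  # still within TTL: serve from cache, skip the origin
        # Missing or older than the TTL: fetch a fresh copy, re-cache, serve it.
        obj = self.fetch(key)
        self.store[key] = (obj, now)
        return obj

origin_calls = []
cache = TTLCache(
    ttl_seconds=60,
    fetch_from_origin=lambda k: origin_calls.append(k) or f"body-of-{k}",
)
cache.get("/index.html")  # cache miss: goes to the origin
cache.get("/index.html")  # within TTL: served from cache, no origin request
print(len(origin_calls))  # → 1
```

A real edge server evaluates freshness per object and per request, but the core logic is the same: only a stale or missing object triggers a trip back to the origin.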

Cache control headers

Methods for controlling cacheable objects are built into the HTTP standard. These "cache control headers" allow you to specify:

  • When a piece of content expires.
  • How the content can be cached (for example, by the customer's browser or by the CDN itself).
  • Whether the content can be cached at all.

The most basic method to configure caching for content delivery is the Caching behavior in Property Manager, but additional methods are available in other Akamai products.