HTTP Caching

HTTP caching is a technique for storing a copy of a given resource and serving it back when it is requested again. Caching allows efficient reuse of previously retrieved or computed data.

A cache stores data in fast-access storage. Cached data is typically temporary, unlike database data, which is complete and durable. Serving data from the cache improves retrieval performance and reduces the need to access the underlying, slower storage layer.

Caches can be private or public. A public cache, also called a shared proxy cache, stores responses for use by multiple users. A private cache, such as a browser cache, serves a single user.

The browser maintains several different caches, each with a distinct, specific use:
  • Image cache - a page-scoped cache that stores decoded image data.
  • Preload cache - like the image cache, is page-scoped and destroyed when the user leaves the page.
  • Service worker cache API - origin-scoped; provides a cache back end with a programmable interface (see the sketch after this list).
  • HTTP cache - uses cache headers such as Cache-Control to determine the need and duration for caching.
  • HTTP/2 push cache (or 'H2 push cache') - stores objects that have been pushed by the server but not yet requested by any page using the connection; destroyed when the connection closes.
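
As a brief illustration of the service worker Cache API mentioned in the list, the TypeScript sketch below serves matching requests from an origin-scoped cache and falls back to the network; the cache name "app-cache-v1" and the loose event typing are assumptions made for this example, not details from the article.

// Minimal sketch of a cache-first fetch handler using the Cache API.
// The cache name is an illustrative assumption.
const CACHE_NAME = "app-cache-v1";

self.addEventListener("fetch", (event: any) => {
  event.respondWith(
    caches.open(CACHE_NAME).then(async (cache) => {
      // Serve from the origin-scoped cache when a match exists...
      const cached = await cache.match(event.request);
      if (cached) {
        return cached;
      }
      // ...otherwise fetch from the network and store a copy for next time.
      const response = await fetch(event.request);
      await cache.put(event.request, response.clone());
      return response;
    })
  );
});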

To determine whether a request can be satisfied with a cached copy of the requested resource, a cache policy specifies the client's requirements for cache freshness and the server's requirements for revalidation. The freshness of a cached entry is determined by the location of the requested resource, the time the resource was retrieved, the headers returned with the response, and the current time.
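
As a rough sketch (a simplified version of the freshness rules, not the full specification algorithm), the TypeScript function below estimates whether a cached entry is still fresh from its Cache-Control max-age or Expires lifetime and its current age; the function name isFresh and its parameters are assumptions introduced for this example.

// Simplified freshness check based on max-age / Expires, Date, and Age headers.
function isFresh(headers: Headers, retrievedAt: Date, now: Date): boolean {
  // Freshness lifetime: prefer Cache-Control: max-age, fall back to Expires - Date.
  const cacheControl = headers.get("Cache-Control") ?? "";
  const maxAgeMatch = cacheControl.match(/max-age=(\d+)/);

  let lifetimeSeconds = 0;
  if (maxAgeMatch) {
    lifetimeSeconds = Number(maxAgeMatch[1]);
  } else {
    const expires = headers.get("Expires");
    const date = headers.get("Date");
    if (expires && date) {
      lifetimeSeconds = (Date.parse(expires) - Date.parse(date)) / 1000;
    }
  }

  // Current age: time since the local retrieval plus any Age reported upstream.
  const ageHeader = Number(headers.get("Age") ?? "0");
  const residentSeconds = (now.getTime() - retrievedAt.getTime()) / 1000;
  const currentAge = ageHeader + residentSeconds;

  return currentAge < lifetimeSeconds;
}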

Example Request with a Cache-Control Header
GET /echo/get/json HTTP/1.1
Host: reqbin.com
Cache-Control: max-age=3600, public, no-transform
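
The same request could also be issued programmatically; the TypeScript sketch below sends it with the Fetch API and the same Cache-Control request header (the reqbin.com endpoint comes from the example above, while the function name is an assumption).

// Sketch: issuing the example request with the Fetch API.
async function fetchEchoJson(): Promise<unknown> {
  const response = await fetch("https://reqbin.com/echo/get/json", {
    headers: {
      "Cache-Control": "max-age=3600, public, no-transform",
    },
  });
  return response.json();
}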

In general, caching lightens the load of network operations and reduces bandwidth requirements.