Cache

SystemDesign.us Blog
5 min read · Jul 28, 2022

Visit systemdesign.us for System Design Interview Questions tagged by companies and their Solutions. Follow us on YouTube, Facebook, LinkedIn, Twitter, Medium, Notion, Quora.

https://aws.amazon.com/caching/

Caching is a technique that stores data in a temporary location so that it can be accessed more quickly. When you request data from a server, the server looks for the requested data in its cache before looking for it elsewhere. This can help to improve performance because it reduces the amount of time that is needed to access the data. Caching can happen at any level shown above in the diagram, including browser caching, CDN caching, and server caching.

Browser Caching

Browser caching is a type of caching that occurs at the client level. When you visit a website, your browser stores certain files on your computer’s hard drive. These files can include HTML files, CSS files, JavaScript files, and images. The next time you visit the same website, your browser can load the files from your hard drive instead of requesting them from the server. This can help to improve performance because it reduces the amount of data that needs to be transferred.
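As a rough illustration, the sketch below shows how a browser might decide whether a locally stored file is still fresh based on the Cache-Control: max-age response header. The header value and the fetched_at timestamp are assumptions for the example, not a description of any particular browser's implementation.

```python
import time

def is_fresh(cached_entry):
    # cached_entry is assumed to hold the stored response headers and the time it was saved
    max_age = int(cached_entry["headers"].get("Cache-Control", "max-age=0").split("=")[1])
    age = time.time() - cached_entry["fetched_at"]
    return age < max_age

cached_entry = {
    "headers": {"Cache-Control": "max-age=3600"},  # server allows reuse for one hour
    "fetched_at": time.time() - 120,               # fetched two minutes ago
}

print(is_fresh(cached_entry))  # True: the browser can reuse the local copy instead of re-downloading
```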

CDN Caching

CDN caching is a type of caching that occurs at the network level. A content delivery network (CDN) is a group of servers located in different parts of the world that keep copies of a website's content. When you request data from a website, the CDN routes your request to the server that is closest to you. This can help to improve performance because it reduces the amount of time that is needed to transfer the data.

Server Caching

Server caching is a type of caching that occurs at the server level. When a server receives a request for data, it looks for the requested data in its cache before looking for it elsewhere. If the data is found in the cache, it can be returned to the user without having to retrieve it from another source. This can help to improve performance because it reduces the amount of time that is needed to access the data.

Different types of server caching include page caching, database caching, and object caching.

Page Caching

Page caching is a type of server caching that stores entire pages in a cache. When a user requests a page, the server looks for the page in the cache before looking for it elsewhere. If the page is found in the cache, it can be returned to the user without having to retrieve it from another source. This can help to improve performance because it reduces the amount of time that is needed to access the page.
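A minimal sketch of page caching, assuming a hypothetical render_page function that stands in for expensive template rendering; the finished HTML is stored in an in-memory dictionary keyed by the request path.

```python
page_cache = {}

def render_page(path):
    # hypothetical: expensive template rendering, database calls, etc.
    return f"<html><body>Content for {path}</body></html>"

def get_page(path):
    if path not in page_cache:           # cache miss: render the page and store it
        page_cache[path] = render_page(path)
    return page_cache[path]              # cache hit: return the stored HTML as-is

print(get_page("/home"))  # rendered and cached
print(get_page("/home"))  # served from the cache
```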

Database Caching

Database caching is a type of server caching that stores the results of database queries in a cache. When a user requests data, the server looks for the query result in the cache before running the query against the database. If the result is found in the cache, it can be returned to the user without having to query the database. This can help to improve performance because it reduces the amount of time that is needed to access the data.
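The sketch below shows this as a simple cache-aside pattern for query results, assuming a hypothetical run_query function standing in for a real database round trip.

```python
query_cache = {}

def run_query(sql):
    # hypothetical stand-in for a real database call
    return [("row", 1), ("row", 2)]

def cached_query(sql):
    if sql not in query_cache:            # miss: go to the database and remember the result
        query_cache[sql] = run_query(sql)
    return query_cache[sql]               # hit: skip the database entirely

rows = cached_query("SELECT * FROM users")
```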

Object Caching

Object caching is a type of server caching that stores application objects, such as the results of expensive computations or data assembled from several sources, in memory. When a user requests data, the server looks for the object in the cache before rebuilding it. If the object is found in the cache, it can be returned to the user without having to reconstruct it. This can help to improve performance because it reduces the amount of time that is needed to access the data.
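As an illustration, the sketch below caches an object built by a hypothetical build_profile function, so repeated requests reuse the already-assembled object.

```python
object_cache = {}

def build_profile(user_id):
    # hypothetical: combine several lookups into one application object
    return {"id": user_id, "name": f"user-{user_id}", "preferences": {}}

def get_profile(user_id):
    if user_id not in object_cache:       # miss: build the object once
        object_cache[user_id] = build_profile(user_id)
    return object_cache[user_id]          # hit: reuse the cached object
```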

Cache Invalidation

Cache invalidation is the process of removing data from a cache. There are different schemes that can be used to invalidate data, including time-based invalidation and event-based invalidation.

Time-Based Invalidation

Time-based invalidation is a scheme that removes data from a cache after a certain amount of time has passed, often expressed as a time-to-live (TTL) on each entry. This can help to ensure that the data in the cache does not stay stale indefinitely.
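A minimal sketch of time-based invalidation, where each entry carries a timestamp and is discarded once its TTL has elapsed; the 60-second TTL and the load_value function are assumptions for the example.

```python
import time

TTL_SECONDS = 60
cache = {}  # key -> (value, stored_at)

def load_value(key):
    # hypothetical stand-in for fetching fresh data from the source
    return f"value-for-{key}"

def get(key):
    entry = cache.get(key)
    if entry is not None:
        value, stored_at = entry
        if time.time() - stored_at < TTL_SECONDS:
            return value                  # still fresh
        del cache[key]                    # expired: invalidate and fall through
    value = load_value(key)
    cache[key] = (value, time.time())
    return value
```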

Event-Based Invalidation

Event-based invalidation is a scheme that removes data from a cache when an event occurs. This can help to ensure that the data in the cache is up-to-date. Events that can trigger cache invalidation include data changes, system failures, and application crashes.
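The sketch below ties invalidation to a data-change event: when a record is updated, its cache entry is removed so the next read fetches fresh data. The update_user and save_to_database names are hypothetical.

```python
user_cache = {}

def save_to_database(user_id, data):
    # hypothetical write to the source of truth
    pass

def update_user(user_id, data):
    save_to_database(user_id, data)
    user_cache.pop(user_id, None)   # the change event invalidates the cached copy
```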

When should the cache be updated?

Write-through cache

Write-through cache is a type of caching where data is written to the cache and the source simultaneously. Each write is slower because it has to reach both places, but the cache and the source stay consistent, and subsequent reads can be served quickly from the cache.
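A minimal write-through sketch: every write goes to the cache and to the backing store in the same operation, so the two never diverge. The write_to_source function is a hypothetical stand-in for the real data store.

```python
cache = {}

def write_to_source(key, value):
    # hypothetical write to the database or other backing store
    pass

def write_through(key, value):
    cache[key] = value            # update the cache...
    write_to_source(key, value)   # ...and the source in the same operation
```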

Write-back cache

Write-back cache is a type of caching where data is first written to the cache and then later written to the source. This can help to improve performance because it reduces the amount of time that is needed to write the data, but it risks losing recent writes if the cache fails before they are flushed to the source.
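A minimal write-back sketch: writes only touch the cache and mark the entry dirty, and dirty entries are flushed to the source later (here by an explicit flush call). write_to_source is again a hypothetical stand-in.

```python
cache = {}
dirty_keys = set()

def write_to_source(key, value):
    # hypothetical write to the backing store
    pass

def write_back(key, value):
    cache[key] = value        # fast: only the cache is touched
    dirty_keys.add(key)       # remember that the source is now stale

def flush():
    for key in list(dirty_keys):
        write_to_source(key, cache[key])
        dirty_keys.discard(key)
```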

Write-around cache

Write-around cache is a type of caching where data is written directly to the source, bypassing the cache. This can help to avoid filling the cache with data that may never be read, but a read of recently written data will miss the cache and have to go to the source.
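A minimal write-around sketch: the write goes straight to the source and any stale cached copy is dropped, so the cache is only populated by reads.

```python
cache = {}

def write_to_source(key, value):
    # hypothetical write to the backing store
    pass

def write_around(key, value):
    write_to_source(key, value)   # bypass the cache on the write path
    cache.pop(key, None)          # drop any stale copy; a later read repopulates it
```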

Read-through cache

Read-through cache is a type of caching where the application reads only from the cache; on a miss, the cache itself loads the data from the source, stores it, and returns it. This can help to improve performance because repeated reads of the same data are served from the cache.
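A minimal read-through sketch: callers only talk to the cache, and the cache loads from the source on a miss. read_from_source is a hypothetical loader function.

```python
cache = {}

def read_from_source(key):
    # hypothetical read from the backing store
    return f"value-for-{key}"

def read_through(key):
    if key not in cache:
        cache[key] = read_from_source(key)   # the cache loads the data itself on a miss
    return cache[key]
```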

Cache Eviction

Cache eviction is the process of removing data from a cache to make room for new data. There are different policies that can be used to evict data.

Least Recently Used (LRU)

The LRU policy evicts the data that has been least recently used. This can help to ensure that the most frequently used data is kept in the cache.
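In Python, a small LRU cache can be sketched with collections.OrderedDict, which remembers insertion order and lets recently used keys be moved to the end; the capacity is arbitrary.

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)            # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)      # evict the least recently used entry
```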

First In First Out (FIFO)

The FIFO policy evicts the data that was added to the cache earliest, regardless of how often it has been used. This can help to ensure that the most recently added data is kept in the cache.
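A FIFO cache only needs to remember insertion order; the sketch below evicts whichever key was added first once the (arbitrary) capacity is exceeded.

```python
fifo_cache = {}
insertion_order = []
CAPACITY = 100

def fifo_put(key, value):
    if key not in fifo_cache:
        insertion_order.append(key)
    fifo_cache[key] = value
    if len(fifo_cache) > CAPACITY:
        oldest = insertion_order.pop(0)   # evict the entry that was added first
        del fifo_cache[oldest]
```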

Least Frequently Used (LFU)

The LFU policy evicts the data that is least frequently used. This can help to ensure that the most frequently used data is kept in the cache.
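An LFU cache tracks how often each key is read and evicts the key with the smallest count when space is needed; the sketch below keeps the counts in a plain dictionary.

```python
lfu_cache = {}
access_counts = {}
CAPACITY = 100

def lfu_get(key):
    if key in lfu_cache:
        access_counts[key] += 1          # record the access
        return lfu_cache[key]
    return None

def lfu_put(key, value):
    if len(lfu_cache) >= CAPACITY and key not in lfu_cache:
        coldest = min(access_counts, key=access_counts.get)   # least frequently used key
        del lfu_cache[coldest]
        del access_counts[coldest]
    lfu_cache[key] = value
    access_counts.setdefault(key, 0)
```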

Random Replacement (RR)

The RR policy evicts the data randomly. This can help to ensure that all data has an equal chance of being evicted.
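Random replacement is the simplest policy to implement: when the cache is full, evict a key chosen uniformly at random.

```python
import random

rr_cache = {}
CAPACITY = 100

def rr_put(key, value):
    if len(rr_cache) >= CAPACITY and key not in rr_cache:
        victim = random.choice(list(rr_cache))   # any entry is equally likely to be evicted
        del rr_cache[victim]
    rr_cache[key] = value
```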
