LRU (Least Recently Used) Cache

Mayil Bayramov
4 min read · Sep 4, 2023


What is caching? Why do we need it?

Caching is a modern technique for accessing data quickly and reducing the number of reads made directly from disk storage. Another example: imagine you have an operation that requires a costly computation every time it is executed. Applying a caching strategy to these kinds of scenarios can lead to a significant improvement in your application.

Can we cache our entire database?

Since caching brings major performance improvements, you might ask: can we load all the data from the database into the cache? Technically it is possible, but because memory is so small compared to disk storage, at some point the application could run out of memory. Therefore, it is recommended to cache only a significant portion of the data rather than the entire database.

What is an LRU cache?

In this article we will mainly focus on the LRU cache, its working strategy, and its use cases. An LRU cache keeps track of the most recently accessed items and removes the least recently used item when the cache reaches its capacity limit. For simplicity, I'll explain it with an example. Most of us have sent an emoji to a friend via a social media application at least once. From the user-experience side, it would be perfect if, when the user opens the emoji section of the keyboard, they could see, for example, the last 20 used emojis. In most social applications you can see the recently used emojis in the first tab.

img-1: emojis

Let’s think about which kind of data structure we can use in this case to load the data in an efficient way.

Choosing appropriate data structure(s)

Let’s assume that our data structure will cache the last 20 used emojis. We need a data structure whose insertion time is constant (O(1) time complexity) and which keeps our data ordered. We can introduce a LinkedList to cover this expectation.

img-2: LinkedList

When a new emoji is sent, first we need to check whether that emoji already exists in our cache. If yes, we move it to the head (remove the node from its current position and add it to the head). Adding a new element to a LinkedList (at the head or tail) takes constant O(1) time, which is cool 😎. But for a lookup operation, a LinkedList gives us linear O(n) time complexity, which is not performant for large datasets.

We also know that key-value based collections (or dictionaries) give us constant O(1) lookup time complexity. So we can combine two powerful data structures, a LinkedList and a HashMap, to access recently used data in an efficient way.
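As a quick aside, Java’s standard library already ships this exact combination: LinkedHashMap keeps a hash table and a doubly linked list in sync, and constructing it with accessOrder = true makes it reorder entries on every access. A minimal sketch of an LRU cache built on top of it could look like this (the class name EmojiLruCache is just an illustrative choice, not part of any fixed API):

import java.util.LinkedHashMap;
import java.util.Map;

public class EmojiLruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public EmojiLruCache(int capacity) {
        // true = order entries by access instead of insertion,
        // so the eldest entry is always the least recently used one
        super(capacity, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // evict the least recently used entry once the capacity is exceeded
        return size() > capacity;
    }
}

In the rest of this article, though, we will look at how this combination works internally, as if we were building it ourselves.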

LRU Cache Internal Architecture

The diagram below (img-3) illustrates the internal architecture of an LRU cache.

img-3: LRU cache architecture illustration

Imagine we have an Emoji class illustrated like this:

public class Emoji {
    private String unicode;
    private String name;
    private String category;

    public Emoji(String unicode) {
        this.unicode = unicode;
    }

    // other constructors ...
    // getters, setters ...
}

LRUCache(int capacity) — we provide the capacity when initializing our cache object.

Emoji get(int key) — it looks up the Emoji object for the given key. Internally it checks the HashMap<Integer, Emoji> to see whether the data exists in the cache. If it does, that Emoji node is moved to the head of the LinkedList.

void put(int key, Emoji value) — principally, put() performs an UPSERT operation (a minimal sketch of both operations follows after the list below).

  1. First, it updates the value if the given key already exists in the cache and moves that node to the head.
  2. If the given key does not exist in the cache, a new Emoji node is created and added to the head of the LinkedList. If the number of keys exceeds the capacity as a result of this operation, the object at the tail is deleted from the LinkedList and that object’s key is deleted from the HashMap.
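To make these two operations concrete, here is a minimal sketch of such a cache (the full implementation is covered in the next article). Note that in this sketch the HashMap stores the linked-list nodes rather than the Emoji values directly, so a node can be unlinked in O(1); the Node class and the helper method names are my own illustrative choices.

import java.util.HashMap;
import java.util.Map;

public class LRUCache {

    // Doubly linked list node: head side = most recently used, tail side = least recently used.
    private static class Node {
        int key;
        Emoji value;
        Node prev, next;
        Node(int key, Emoji value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<Integer, Node> map = new HashMap<>();
    private final Node head = new Node(0, null); // sentinel
    private final Node tail = new Node(0, null); // sentinel

    public LRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public Emoji get(int key) {
        Node node = map.get(key);
        if (node == null) return null; // cache miss
        moveToHead(node);              // mark as most recently used
        return node.value;
    }

    public void put(int key, Emoji value) {
        Node node = map.get(key);
        if (node != null) {            // 1. key exists: update value and move to head
            node.value = value;
            moveToHead(node);
            return;
        }
        node = new Node(key, value);   // 2. key is new: insert at head
        map.put(key, node);
        addAfterHead(node);
        if (map.size() > capacity) {   // over capacity: evict the least recently used node
            Node lru = tail.prev;
            unlink(lru);
            map.remove(lru.key);
        }
    }

    private void moveToHead(Node node) { unlink(node); addAfterHead(node); }

    private void unlink(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void addAfterHead(Node node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }
}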

Below is a visual illustration (img-4) of how it works for the given code snippet:

LRUCache lruCache = new LRUCache(4);
lruCache.put(618, new Emoji("U+1F618")); //😘
lruCache.put(648, new Emoji("U+1F648")); //🙈
lruCache.put(440, new Emoji("U+1F440")); //👀
lruCache.put(648, new Emoji("U+1F648")); //🙈
lruCache.put(160, new Emoji("U+1F60E")); //😎
lruCache.put(144, new Emoji("U+1F44F")); //👏

img-4: visual illustration
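
Tracing the snippet above step by step (most recently used on the left, least recently used on the right), assuming the put() behavior described earlier, the cache evolves like this:

put(618, 😘) → [😘]
put(648, 🙈) → [🙈, 😘]
put(440, 👀) → [👀, 🙈, 😘]
put(648, 🙈) → already cached, moved to the head → [🙈, 👀, 😘]
put(160, 😎) → [😎, 🙈, 👀, 😘] (the cache is now at full capacity)
put(144, 👏) → capacity exceeded, so 😘 (the least recently used) is evicted from the tail → [👏, 😎, 🙈, 👀]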

Thank you for taking the time to read my article! You can discover the implementation details in the next article.

If you find this article helpful, your support, such as a like, a comment, or a share, is like rocket fuel for my passion for technology. It motivates me to dive deeper into the tech world and deliver more content that matters to you.

If you’re looking to offer more support, consider buying me a coffee, and I’ll transform that caffeine boost into lines of code and tech insights. Your coffee becomes the code that powers our tech journey. ☕💻🚀 https://www.buymeacoffee.com/mayilb95

Let’s connect on LinkedIn: Mayil-Bayramov
