Building an LRU Cache in Java: A Simple Approach Using LinkedHashMap

Weberton Faria
Just Eat Takeaway-tech


Introduction

In the world of software development, efficient management of resources is key to building high-performance applications. A fundamental technique in this domain is implementing a Least Recently Used (LRU) cache. Such caches are crucial in optimizing data retrieval processes and managing memory efficiently, especially in resource-constrained environments.

In this article, we’re going to walk through how to construct an LRU cache in Java, using the versatile LinkedHashMap class.

Let’s dive in and unpack this essential technique.

What is a Cache?

A cache is a small, fast layer of storage that keeps copies of recently or frequently used data. Its main job is to make accessing that data quicker than fetching it from the original source, which helps applications avoid repeating slow operations like database queries or disk reads.

Caching is a big deal in making computers work faster. It’s used in lots of different areas, like making websites load quickly or helping apps on your phone run smoothly. By using caching, we can make sure computers do their jobs more efficiently.

What is LRU Cache?

LRU stands for ‘Least Recently Used,’ and it’s a smart way of managing cache memory. Imagine a cache as a small box where you can only fit a limited number of items. When it gets full and you need to put something new in, LRU helps by removing the item that hasn’t been used for the longest time. This method works great in situations where you’re less likely to need old data again.

Let’s take an example: think about a website that recommends products to you. To suggest items you might like, it needs to quickly look at what you’ve recently viewed or bought. But with so many products and users, it’s impossible to store all this information in the main memory — it’s just too much.

Here’s where an LRU cache comes in handy. It keeps track of the products you’ve looked at or bought most recently. So, as you move around on the website, it can quickly use this cache to suggest other items you might like, making the whole experience faster and more personal.

How to Build an LRU Cache in Practice

The LRU Cache can be implemented using a doubly linked list combined with a HashMap. However, in Java, this implementation is already abstracted by the LinkedHashMap class. Internally, LinkedHashMap maintains a doubly linked list to keep track of the order of its entries, on top of a hash table that stores the keys and values, thereby enabling quick access to elements.
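
To make those mechanics concrete, here is a minimal hand-rolled sketch of that approach: a HashMap for O(1) lookups plus an explicit doubly linked list that keeps entries ordered from least to most recently used. The ManualLRUCache name and structure are purely illustrative, not taken from any library:

import java.util.HashMap;
import java.util.Map;

public class ManualLRUCache<K, V> {

    private static final class Node<K, V> {
        K key;
        V value;
        Node<K, V> prev, next;
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<K, Node<K, V>> map = new HashMap<>();
    private final Node<K, V> head = new Node<>(null, null); // sentinel: least recently used side
    private final Node<K, V> tail = new Node<>(null, null); // sentinel: most recently used side

    public ManualLRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public V get(K key) {
        Node<K, V> node = map.get(key);
        if (node == null) return null;
        moveToTail(node); // a read counts as a "use"
        return node.value;
    }

    public void put(K key, V value) {
        Node<K, V> node = map.get(key);
        if (node != null) {
            node.value = value;
            moveToTail(node);
            return;
        }
        if (map.size() == capacity) { // full: evict the least recently used entry
            Node<K, V> lru = head.next;
            unlink(lru);
            map.remove(lru.key);
        }
        Node<K, V> fresh = new Node<>(key, value);
        map.put(key, fresh);
        linkBeforeTail(fresh);
    }

    private void unlink(Node<K, V> node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void linkBeforeTail(Node<K, V> node) {
        node.prev = tail.prev;
        node.next = tail;
        tail.prev.next = node;
        tail.prev = node;
    }

    private void moveToTail(Node<K, V> node) {
        unlink(node);
        linkBeforeTail(node);
    }
}

As you can see, keeping the list pointers straight is fiddly. LinkedHashMap does all of this bookkeeping for us, which is why the version below is so much shorter.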

Java Implementation Using LinkedHashMap

import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache<K, V> extends LinkedHashMap<K, V> {

    private final int capacity;

    public LRUCache(int capacity) {
        // initial capacity, load factor, accessOrder = true (order entries by access)
        super(capacity, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Called after every insertion: evict the least recently used
        // entry whenever the cache grows beyond its capacity.
        return size() > this.capacity;
    }
}

Breaking Down the LRU Cache Code

The LRUCache class takes advantage of a built-in data structure in Java, the LinkedHashMap. This structure is pretty handy because it can keep track of the order of elements based on either when they were added (insertion order) or when they were last accessed (access order). For our LRU Cache, maintaining the access order is crucial. This means every time an item is read or written, it gets moved to the end of the list.

When we set up our LRUCache class, we start by calling the constructor of LinkedHashMap with three important parameters: initial capacity, load factor, and access order.

  • Initial Capacity: the starting size of the underlying hash table. We pass our cache’s maximum size here, and we also store it in the capacity field so we can enforce the limit ourselves: whenever adding a new entry pushes the size past it, the least recently used item gets evicted.
  • Load Factor: think of this as a threshold that tells the HashMap how full it can get before it needs to grow bigger. We use 0.75, the usual default, meaning the table would expand at 75% full. In practice, our eviction logic keeps the cache at or below its capacity, so the map stays small.
  • Access Order: a simple true/false setting. We set this to true because we want to keep entries ordered by how recently they’ve been accessed, not by when they were put in (see the short demo after this list).
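
To see the accessOrder flag in isolation, here’s a small sketch using a plain LinkedHashMap (the AccessOrderDemo class name is just for illustration):

import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {
    public static void main(String[] args) {
        // accessOrder = true: iteration order follows recency of access
        Map<String, Integer> map = new LinkedHashMap<>(16, 0.75f, true);
        map.put("a", 1);
        map.put("b", 2);
        map.put("c", 3);
        map.get("a"); // reading "a" moves it to the most recently used position
        System.out.println(map); // {b=2, c=3, a=1}
    }
}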

There’s also a crucial bit where we override the removeEldestEntry method from LinkedHashMap. This method is called after every insertion via put or putAll, and it decides whether the eldest entry (the least recently used one) should be tossed out. Our logic here is straightforward: if our map’s size is bigger than our set capacity, the eldest entry gets removed. This is the heart of the LRU eviction process.

Using the LRUCache Class: A Practical Example

public static void main(String[] args) {
    LRUCache<Integer, Integer> cache = new LRUCache<>(2);

    cache.put(1, 1);
    cache.put(2, 2);
    System.out.println(cache); // output: {1=1, 2=2}
    cache.get(1);              // returns 1
    System.out.println(cache); // output: {2=2, 1=1}
    cache.put(3, 3);           // evicts 2, adds 3
    System.out.println(cache); // output: {1=1, 3=3}
    cache.get(2);              // returns null (2 was evicted)
    cache.put(4, 4);           // evicts 1, adds 4
    System.out.println(cache); // output: {3=3, 4=4}
    cache.get(3);              // returns 3
    System.out.println(cache); // output: {4=4, 3=3}
    cache.get(4);              // returns 4
    System.out.println(cache); // output: {3=3, 4=4}
}
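
To tie this back to the recommendation scenario from earlier, here’s a quick hypothetical sketch of the cache in a more realistic role (the product IDs and names are made up purely for illustration):

public static void demoRecentlyViewed() {
    // Keep only the 100 most recently viewed products for a user.
    LRUCache<String, String> recentlyViewed = new LRUCache<>(100);
    recentlyViewed.put("p-42", "Margherita Pizza");
    recentlyViewed.put("p-7", "Sushi Box");

    // A lookup counts as a "use", so p-42 becomes the most recently
    // used entry and will survive eviction longer than untouched ones.
    System.out.println(recentlyViewed.get("p-42")); // Margherita Pizza
}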

Conclusion

In this article, we’ve explored how an LRU Cache can dramatically speed up data access in applications where quick responses are critical. Using Java’s LinkedHashMap to implement this algorithm offers a great mix of simplicity and effectiveness. Hopefully, you now have a clearer understanding of the LRU Cache and feel ready to apply this technique to improve your applications.

Happy coding!
