Learn Coding Concepts with Shaila

Learn different programming concepts and how to implement them.


LRU Cache Implementation in C++


What is an LRU Cache?

An LRU Cache is a data structure that stores a limited number of items. When the cache is full, it evicts the least recently used item to make room for new data.

For example:

  • Capacity: 3
  • Sequence of operations (keys only, values omitted): put(1), put(2), put(3), get(1), put(4)
  • Result: the cache holds keys {1, 3, 4}, because key 2 was the least recently used when put(4) arrived and was evicted. The sequence is traced step by step below.
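
Tracing the usage order step by step, written from most recently used to least recently used:

put(1)  ->  [1]
put(2)  ->  [2, 1]
put(3)  ->  [3, 2, 1]
get(1)  ->  [1, 3, 2]      (key 1 moves to the front)
put(4)  ->  [4, 1, 3]      (cache is full, so key 2, the least recently used, is evicted)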

How Does an LRU Cache Work?

To implement an LRU cache, we need:

  • Fast lookups: To retrieve a value using its key.
  • Fast updates: To move an item to the most recently used position whenever it is accessed or inserted, and to evict the least recently used item when the cache is full.

We can achieve this by combining:

  • A hash map for O(1) average-time lookups.
  • A doubly linked list that keeps items ordered from most recently used to least recently used, so moving an item to the front or evicting from the back is also O(1). A quick sketch of this pairing follows.
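
For intuition, the same pairing can be expressed with standard containers: std::list plays the role of the doubly linked list, and std::unordered_map maps each key to its position in that list. This is only a sketch of the idea (the names order and byKey are illustrative); the implementation below hand-rolls the linked list instead.

#include <list>
#include <unordered_map>
#include <utility>

// Usage order: front = most recently used, back = least recently used.
std::list<std::pair<int, int>> order;   // (key, value) nodes

// key -> iterator into `order`, so a lookup never walks the list.
std::unordered_map<int, std::list<std::pair<int, int>>::iterator> byKey;

// Moving a node to the front on every access stays O(1):
//   order.splice(order.begin(), order, byKey[key]);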

Implementation

Here’s how we can implement the LRU Cache:

#include <bits/stdc++.h>
using namespace std;

class LRUCache {
private:
    struct Node {
        int key, value;
        Node* prev;
        Node* next;
        Node(int k, int v) : key(k), value(v), prev(nullptr), next(nullptr) {}…
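
The snippet above is cut off at this point. Below is a minimal, self-contained sketch that continues the same approach: a hand-rolled doubly linked list with head and tail sentinel nodes, plus an unordered_map from each key to its node. The helper names detach and insertAtFront are mine for illustration, not taken from the original snippet, and the sketch assumes a capacity of at least 1. A small main at the end runs the example sequence from the start of the post.

#include <iostream>
#include <unordered_map>
using namespace std;

class LRUCache {
private:
    struct Node {
        int key, value;
        Node* prev;
        Node* next;
        Node(int k, int v) : key(k), value(v), prev(nullptr), next(nullptr) {}
    };

    int capacity;
    unordered_map<int, Node*> cache;   // key -> node, O(1) average lookup
    Node* head;                        // sentinel; head->next is the most recently used node
    Node* tail;                        // sentinel; tail->prev is the least recently used node

    // Unlink a node from wherever it sits in the list.
    void detach(Node* node) {
        node->prev->next = node->next;
        node->next->prev = node->prev;
    }

    // Insert a node right after the head sentinel, i.e. mark it most recently used.
    void insertAtFront(Node* node) {
        node->next = head->next;
        node->prev = head;
        head->next->prev = node;
        head->next = node;
    }

public:
    // Assumes cap >= 1.
    explicit LRUCache(int cap) : capacity(cap) {
        head = new Node(0, 0);
        tail = new Node(0, 0);
        head->next = tail;
        tail->prev = head;
    }

    ~LRUCache() {
        for (Node* cur = head; cur != nullptr; ) {
            Node* next = cur->next;
            delete cur;
            cur = next;
        }
    }

    int get(int key) {
        auto it = cache.find(key);
        if (it == cache.end()) return -1;      // miss
        Node* node = it->second;
        detach(node);                          // touch: move to the front
        insertAtFront(node);
        return node->value;
    }

    void put(int key, int value) {
        auto it = cache.find(key);
        if (it != cache.end()) {               // key already present: update value and touch
            it->second->value = value;
            detach(it->second);
            insertAtFront(it->second);
            return;
        }
        if ((int)cache.size() == capacity) {   // full: evict the least recently used node
            Node* lru = tail->prev;
            detach(lru);
            cache.erase(lru->key);
            delete lru;
        }
        Node* node = new Node(key, value);
        cache[key] = node;
        insertAtFront(node);
    }
};

int main() {
    // The sequence from the example above, using each key as its own value.
    LRUCache lru(3);
    lru.put(1, 1);
    lru.put(2, 2);
    lru.put(3, 3);
    cout << lru.get(1) << "\n";   // 1  (key 1 becomes most recently used)
    lru.put(4, 4);                //    (key 2 was least recently used, so it is evicted)
    cout << lru.get(2) << "\n";   // -1 (miss: key 2 is gone)
    cout << lru.get(4) << "\n";   // 4
    return 0;
}

Note that eviction touches only tail->prev and a single hash map erase, and every get or put does a constant number of pointer updates, which is what makes both operations O(1).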
