Instant Page Loads with an LRU State Cache in Flutter BLoC

HM Tamim
Oct 17, 2023


Ever wondered how popular apps load pages instantly when you revisit them and the data hasn’t changed? This is often achieved by caching the data in memory or local storage. In this article, we’ll implement a simple LRU (Least Recently Used) cache for UI state inside a BLoC.

Minimizing loading states can significantly enhance the user experience. A real-life scenario is a bottom navigation bar: when users switch quickly between tabs, each page should load instantly after the first visit. If necessary, APIs can be called in the background to sync the data.

The LRU Cache Solution

LRU caching is a strategy where the most recently accessed items are kept readily available, while the least recently used items are the first to be discarded when the cache reaches capacity. By integrating this mechanism directly into our BLoC pattern, we can ensure that the most relevant states are instantly accessible.

Implementation Insights

  1. LinkedHashMap: Dart’s LinkedHashMap maintains the insertion order of its entries, making it perfect for our LRU cache. The oldest (least recently accessed) item is the first in the map, and thus the first to be removed when making space for newer entries (see the sketch after this list).
  2. State Management: Whenever a state is accessed or modified, it’s re-inserted into our cache, ensuring it’s now the most recently used. When the cache limit is reached, the oldest item is removed.
  3. Integration with BLoC: On every state change or access within our BLoC, the state is either pulled from the cache (if available) or fetched from its original source and then stored in the cache.
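
A quick sketch of that ordering trick in plain Dart (no BLoC involved), to make the remove-and-re-insert behavior concrete:

import 'dart:collection';

void main() {
  // Iteration over a LinkedHashMap follows insertion order,
  // so the first key is always the least recently used.
  final cache = LinkedHashMap<String, int>();
  cache['a'] = 1;
  cache['b'] = 2;
  cache['c'] = 3;

  // "Touch" 'a': remove and re-insert so it becomes the most recent.
  final value = cache.remove('a');
  if (value != null) cache['a'] = value;
  print(cache.keys.toList()); // [b, c, a]

  // Evicting the LRU entry is just removing the first key.
  cache.remove(cache.keys.first);
  print(cache.keys.toList()); // [c, a]
}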

Benefits

  • Speed: Most notably, users experience near-instantaneous loading times for frequently accessed states.
  • Efficiency: Only the most relevant states are kept in memory, preventing excessive memory usage.
  • Simplicity: With Dart’s built-in tools and Flutter’s BLoC pattern, the LRU cache integration is seamless and straightforward.

Now let’s dive into the code

Create a new BlocStateCache class

@singleton
class BlocStateCache {
  final int capacity;

  // A Dart map literal is a LinkedHashMap, so iteration order is
  // insertion order: the first key is the least recently used.
  final _cache = <Type, dynamic>{};

  BlocStateCache({this.capacity = 10});

  /// If the cache already contains this state type, remove it and
  /// re-insert it so that it becomes the most recently used entry.
  void store<T>(T state) {
    if (_cache.containsKey(T)) {
      _cache.remove(T);
    }

    // If we're at capacity, evict the least recently used entry.
    if (_cache.length >= capacity) {
      _cache.remove(_cache.keys.first);
    }

    _cache[T] = state;
  }

  /// If the state exists, remove and re-insert it, marking it
  /// as the most recently accessed entry.
  T? retrieve<T>() {
    final state = _cache.remove(T);
    if (state != null) {
      _cache[T] = state;
    }
    return state;
  }

  void clear<T>() {
    _cache.remove(T);
  }

  void clearAll() {
    _cache.clear();
  }
}

For DI, we are using get_it along with injectable. You can use any library you prefer or simply use the Singleton pattern.
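
If you skip injectable’s code generation, a minimal manual registration with get_it could look like the sketch below (in that case, drop the @singleton annotation from BlocStateCache). The setupDependencies name is just an example:

import 'package:get_it/get_it.dart';

final getIt = GetIt.instance;

void setupDependencies() {
  // One shared cache instance for the whole app.
  getIt.registerLazySingleton<BlocStateCache>(() => BlocStateCache());
}

Call setupDependencies() once at startup, before any BLoC tries to read the cache.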

Now you are ready to use it in your desired BLoC. However, the two extension methods below can significantly reduce boilerplate, making it easier to implement caching across multiple BLoCs in your application.

extension EmitterStateCaching<S> on Emitter<S> {
  /// Emits the cached state of type [State], if one exists.
  void tryFromCache<State extends S>() {
    final stateCache = getIt<BlocStateCache>();
    final cachedState = stateCache.retrieve<State>();
    if (cachedState != null) {
      this(cachedState);
    }
  }

  /// Emits [state] and stores it in the cache.
  void andCache<State extends S>(State state) {
    this(state);
    final stateCache = getIt<BlocStateCache>();
    stateCache.store(state);
  }
}

Finally, here’s how you implement state caching in a BLoC:

class ExampleBloc extends Bloc<ExampleEvent, ExampleState> {
  ExampleBloc() : super(InitialState()) {
    on<LoadPageEvent>((event, emit) async {
      emit.tryFromCache<PageLoadState>(); // emit cached state if it exists
      final fetchedData = await fetchDataFromApiOrDb();
      emit.andCache(PageLoadState(data: fetchedData)); // instead of emit(...)
    });
  }
}

There you go! Now your pages should load instantly when a user revisits them after the first time.
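
One thing the example glosses over: when you know the cached state is stale, say after the user edits the page’s content, you can evict just that state with the clear<T>() method from BlocStateCache, so the next load fetches fresh data. A small sketch (saveChangesToApi is a hypothetical call standing in for your own write path):

Future<void> onPageEdited() async {
  await saveChangesToApi(); // hypothetical: persist the user's edit
  // Evict the stale cached state; the next LoadPageEvent will find
  // nothing in the cache and fetch from the original source.
  getIt<BlocStateCache>().clear<PageLoadState>();
}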

Remember: Caching is a tool, not a panacea. While it can significantly enhance performance and user experience in many scenarios, as an engineer, it’s essential for you to evaluate its suitability based on your app’s specific requirements and challenges.

Consider Caching When:

  • Data is accessed frequently.
  • Data changes are infrequent.
  • Enhancing user experience is a priority.

Proceed with Caution When:

  • Data is volatile or changes often.
  • Handling large datasets that might bloat memory.
  • Dealing with sensitive or interdependent data.
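
For the last point in particular, one simple mitigation is to evict cached state the moment it must not survive, for example on logout. Assuming the getIt setup from earlier:

void onLogout() {
  // Drop every cached state so nothing leaks into the next session.
  getIt<BlocStateCache>().clearAll();
}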

Cheers!!
