Adding caching to any component with IoC in .NET

An ultra-sleek strategy with zero impact on the original code

Abnoan Muniz
.Net Programming
Nov 15, 2023



We’ll showcase how to apply the Decorator design pattern to intercept a component, inject new behavior, and still execute the original component.

Performance

If performance could be summed up in one word, CACHE would take the crown.

There are numerous ways to rev up performance:

  • Optimizing code.
  • Enhancing database queries.
  • Creating an index.
  • Improving algorithms.
  • Leveraging parallel programming.
  • Boosting with more hardware.

Yet when your system scales up, handling zillions of requests per minute, you’ll have to revisit your app’s caching strategy.

The Problem

Consider this snippet:

[ApiController, Route("cars")]
public class CarController : ControllerBase
{
    private readonly ICarStore _store;

    public CarController(ICarStore store)
    {
        _store = store;
    }

    [HttpGet]
    public IActionResult Get()
    {
        return Ok(_store.List());
    }
}

Notice how the [Controller] gets an [ICarStore] injected; the store hits the database and fetches the data, and the controller returns it in the response.

Ponder this:

How can we add caching without tweaking the Controller or the CarStore?

Suppose the [ICarStore] implementation is in a component beyond your reach, like a NuGet library, and the SQL query it's running is as slow as molasses.
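
For reference, here is a minimal sketch of the contract assumed throughout this article; the Car shape is illustrative, and only the two interface members matter:

using System.Collections.Generic;

public record Car(int Id, string Model);

public interface ICarStore
{
    IEnumerable<Car> List();
    Car Get(int id);
}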

The Technique

It’s all about the [Decorator] design pattern.

It’s a two-step strategy:

  1. Cook up a caching class that gets an injection of [ICarStore] and also implements the same interface.
  2. Register the concrete class implementing [ICarStore] and swap the DI registration.

First Step

Peek at the implementation below:

public class CarCachingStore<T> : ICarStore
    where T : ICarStore
{
    private readonly IMemoryCache _memoryCache;
    private readonly T _inner;
    private readonly ILogger<CarCachingStore<T>> _logger;

    public CarCachingStore(IMemoryCache memoryCache, T inner, ILogger<CarCachingStore<T>> logger)
    {
        _memoryCache = memoryCache;
        _inner = inner;
        _logger = logger;
    }

    // Implementations shown in the full listing below.
    public IEnumerable<Car> List() => throw new NotImplementedException();
    public Car Get(int id) => throw new NotImplementedException();
}

This component is generic over a type <T> that implements [ICarStore], and it implements the same interface itself.

Second Step

Somewhere in the code, there’s a DI registration for [ICarStore] and its implementation.

public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers();
    services.AddScoped<ICarStore, CarStore>();
}

We just need to tinker a bit with ASP.NET Core’s DI registrations.

public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers();
    RegisterServices(services);
    EnableCache(services);
}

private static void RegisterServices(IServiceCollection services)
{
    services.AddScoped<ICarStore, CarStore>();
}

private void EnableCache(IServiceCollection services)
{
    services.AddMemoryCache(); // required so the decorator can resolve IMemoryCache
    services.AddScoped<CarStore>();
    services.AddScoped<ICarStore, CarCachingStore<CarStore>>();
}

First, we associate [ICarStore] with its original implementation, [CarStore].

Then, we register the concrete [CarStore] and redo the [ICarStore] registration, this time linking it to the caching component we crafted earlier. Since the container resolves a single service from its last registration, [CarCachingStore<CarStore>] now wins, and it receives the concrete [CarStore] as its inner store.

Remember, this is just a demo. Typically, there’s logic to enable or disable this swap, as sketched below.
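
One minimal sketch of such a toggle, assuming a boolean Caching:Enabled configuration setting (the key name is illustrative) and the conventional Configuration property on Startup:

private void EnableCache(IServiceCollection services)
{
    // Hypothetical flag; an environment check such as env.IsProduction()
    // works just as well.
    if (!Configuration.GetValue<bool>("Caching:Enabled"))
    {
        return; // keep the plain CarStore registration from RegisterServices
    }

    services.AddMemoryCache();
    services.AddScoped<CarStore>();
    services.AddScoped<ICarStore, CarCachingStore<CarStore>>();
}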

Below is the complete CarCachingStore code:

using System;
using System.Collections.Generic;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Logging;

public class CarCachingStore<T> : ICarStore
    where T : ICarStore
{
    private readonly IMemoryCache _memoryCache;
    private readonly T _inner;
    private readonly ILogger<CarCachingStore<T>> _logger;

    public CarCachingStore(IMemoryCache memoryCache, T inner, ILogger<CarCachingStore<T>> logger)
    {
        _memoryCache = memoryCache;
        _inner = inner;
        _logger = logger;
    }

    public IEnumerable<Car> List()
    {
        var key = "Cars";
        var item = _memoryCache.Get<IEnumerable<Car>>(key);

        if (item == null)
        {
            item = _inner.List();
            if (item != null)
            {
                _memoryCache.Set(key, item, TimeSpan.FromMinutes(1));
            }
        }

        return item;
    }

    public Car Get(int id)
    {
        var key = GetKey(id.ToString());
        var item = _memoryCache.Get<Car>(key);

        if (item == null)
        {
            _logger.LogTrace("Cache miss for {cacheKey}", key);
            item = _inner.Get(id);
            if (item != null)
            {
                _logger.LogTrace("Setting item in cache for {cacheKey}", key);
                _memoryCache.Set(key, item, TimeSpan.FromMinutes(1));
            }
        }
        else
        {
            _logger.LogTrace("Cache hit for {cacheKey}", key);
        }

        return item;
    }

    // Prefix keys with the inner store's type name to avoid collisions.
    private static string GetKey(string key)
    {
        return $"{typeof(T).FullName}:{key}";
    }
}
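
With that in place, resolving [ICarStore] yields the decorator wrapping the real store; a quick usage sketch, where serviceProvider stands in for the app’s container:

// First call hits the database; repeats within a minute come from cache.
var store = serviceProvider.GetRequiredService<ICarStore>();
var cars = store.List();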

Use Cases

This strategy enables caching without changing the original code, as shown.

The app only knows a single interface and never needs to determine whether data comes from the cache. This spares consumers from choosing between parallel abstractions like:

  • ICarCacheStore
  • ICarStore

Plus, you can toggle caching based on the environment: cache in production and always hit the DB in development.

Another use case is adding logging to a component with the same technique.
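
A hypothetical sketch of such a logging decorator, built exactly like the caching one:

public class CarLoggingStore<T> : ICarStore
    where T : ICarStore
{
    private readonly T _inner;
    private readonly ILogger<CarLoggingStore<T>> _logger;

    public CarLoggingStore(T inner, ILogger<CarLoggingStore<T>> logger)
    {
        _inner = inner;
        _logger = logger;
    }

    public IEnumerable<Car> List()
    {
        _logger.LogInformation("Listing cars via {store}", typeof(T).Name);
        return _inner.List();
    }

    public Car Get(int id)
    {
        _logger.LogInformation("Fetching car {id} via {store}", id, typeof(T).Name);
        return _inner.Get(id);
    }
}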

Who’s on Board?

The caching strategy we’ve described is in active use within industry applications. It’s a proven method for enhancing performance without the need to rework the foundational code.

IdentityServer4 is an example of a platform employing this approach, facilitating caching through its [Builder]. This underscores the strategy’s viability as a powerful tool in practical software development.
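
In the same spirit, the registration swap can be wrapped in an extension method so callers opt in with a single call; a hypothetical sketch (all names are illustrative):

public static class CarStoreServiceCollectionExtensions
{
    public static IServiceCollection AddCachedCarStore(this IServiceCollection services)
    {
        services.AddMemoryCache();
        services.AddScoped<CarStore>();
        services.AddScoped<ICarStore, CarCachingStore<CarStore>>();
        return services;
    }
}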

Comparison with Other Caching Strategies

When considering adding caching to an application, it pays to understand the landscape of caching strategies. The decorator-based approach outlined above is one of several available methodologies. Each has its strengths and weaknesses, and the specific requirements and constraints of the application should dictate the choice among them.

Decorator-Based Caching

Decorator-based caching, as implemented through the Decorator design pattern in .NET, is an elegant solution that allows developers to add caching functionality to existing components with minimal code intrusion. This method involves creating a new class that wraps the original class and intercepts method calls to add the caching layer.

Pros

  • Non-Intrusive: Enhances classes without modifying their code.
  • Isolation: Separates caching logic from business logic.
  • Flexibility: Easily switches caching on or off, or changes the caching strategy without affecting the original class.
  • Conformity: The decorator class conforms to the interface of the original class, ensuring compatibility.

Cons

  • Complexity: Can increase complexity with additional classes and interfaces.
  • Single Responsibility Violation: The caching class may do more than just caching if not carefully designed.
  • Overhead: Additional object creation and method call overhead can occur.

Proxy Caching

Proxy caching stands between the client and the server, caching the output of requests to reduce the number of times a request must be fully processed.

Pros

  • Centralization: Offers a centralized caching mechanism for multiple applications.
  • Reduced Load: Reduces load on the original server by serving cached responses.
  • Transparency: The presence of a cache is typically transparent to end-users and applications.

Cons

  • Latency: Can introduce network latency if the proxy server is not near the client.
  • Limited Scope: It may not be suitable for user-specific data caching.
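
If you want something in this family without standing up a separate proxy server, ASP.NET Core ships a response caching middleware; a minimal sketch:

public void ConfigureServices(IServiceCollection services)
{
    services.AddResponseCaching();
    services.AddControllers();
}

public void Configure(IApplicationBuilder app)
{
    app.UseRouting();
    app.UseResponseCaching();
    app.UseEndpoints(endpoints => endpoints.MapControllers());
}

Actions then opt in by emitting cache headers, for example with [ResponseCache(Duration = 60)] on the endpoint.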

Client-Side Caching

Client-side caching stores data on the client’s device, such as in a web browser or mobile app, allowing for immediate access to cached data without a network request.

Pros

  • Performance: Greatly improves user experience by reducing loading times.
  • Offline Access: Can provide data access even when the client is offline.

Cons

  • Storage Limitations: Limited by the storage capacity of the client’s device.
  • Security: Sensitive data may be at risk if not properly secured on the client side.
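
On the server side, you influence client-side caches through standard HTTP headers; a sketch reusing the earlier CarController action:

using Microsoft.Net.Http.Headers;

[HttpGet]
public IActionResult Get()
{
    // Ask browsers (and shared caches) to keep this response for five minutes.
    Response.GetTypedHeaders().CacheControl = new CacheControlHeaderValue
    {
        Public = true,
        MaxAge = TimeSpan.FromMinutes(5)
    };

    return Ok(_store.List());
}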

Distributed Caching Solutions

Distributed caching systems, like Redis or Memcached, provide a way to store data across multiple servers, which can be a boon for scalability and fault tolerance.

Pros

  • Scalability: Can handle large volumes of data and high throughput.
  • Availability: Provides high availability and fault tolerance.

Cons

  • Complexity: Requires managing an additional distributed system.
  • Consistency: Ensuring data consistency across nodes can be challenging.
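
The decorator from this article ports to a distributed cache almost mechanically: swap IMemoryCache for IDistributedCache and serialize the entries. A sketch of the Get method, assuming an injected IDistributedCache field named _distributedCache and a Redis backend registered via AddStackExchangeRedisCache:

using System.Text.Json;
using Microsoft.Extensions.Caching.Distributed;

public Car Get(int id)
{
    var key = GetKey(id.ToString());

    // Distributed caches store strings/bytes, so entries are serialized.
    var json = _distributedCache.GetString(key);
    if (json != null)
    {
        return JsonSerializer.Deserialize<Car>(json);
    }

    var item = _inner.Get(id);
    if (item != null)
    {
        _distributedCache.SetString(key, JsonSerializer.Serialize(item),
            new DistributedCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(1)
            });
    }

    return item;
}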

Final Thoughts

The integration of AI and machine learning will significantly influence the future of caching strategies. These technologies promise to bring about self-optimizing caching systems that can analyze access patterns and adjust fetching or eviction policies predictively. Imagine a caching system that not only adapts to changing usage patterns but also anticipates them, reducing the need to tune caching parameters manually.

Machine learning could enable more sophisticated invalidation strategies, determining the optimal times for cache eviction based on historical data, potentially increasing efficiency and performance. The goal would be a system that pre-emptively refreshes cache entries right before they are needed, thus providing faster access to data.

The implementation of such intelligent systems should keep transparency and maintainability in focus. Despite the complexity that AI algorithms may introduce, the caching system should not become opaque. Instead, it should be a comprehensible and well-documented part of the application’s architecture.

Middleware services might be developed to interface with the caching layer, offering a seamless performance enhancement through AI-driven optimization. This melding of caching and AI is not just a theoretical exercise; it represents the next step in the evolution of application development.

Such advancements will likely become an integral part of a developer’s toolkit, redefining the standards for application performance optimization. As AI and machine learning continue to mature, they will play a pivotal role in designing and implementing caching mechanisms, leading to systems that are more intelligent and more attuned to the needs of both the application and its users.

Abnoan Muniz

Senior .NET Developer, passionate about problem-solving. Support me: https://ko-fi.com/abnoanmuniz. Get in touch: linktr.ee/AbnoanM