Overview of caching in .NET 8 API

Chaitanya (Chey) Penmetsa · Published in CodeNx · 5 min read · Apr 19, 2024

In this blog let’s look at the different caching options available in .NET 8 with respect to an API. We will walk through each option and compare them with one another; in future blogs we will explore each caching option in more depth.

Why do we need caching?

Caching is a fundamental technique used in software development to improve the performance, scalability, and efficiency of applications. Here are several reasons why caching is essential, illustrated with real-world analogies:

  • Faster Access — Think of caching like storing your favorite snacks in a cupboard right next to you instead of going to the store every time you’re hungry. When data is cached, it’s kept close to you or your device, so it’s quicker to grab. For instance, when you visit a website, instead of waiting for every piece of information to be fetched from far-away servers, some of it is already stored nearby, making the website load faster.
  • Reduced Waiting Time — Imagine waiting for a package to arrive from a distant location versus picking it up from a nearby store. Caching stores important information closer to you, like having a local store for your most frequently needed items. So, when you need that information again, you don’t have to wait for it to travel long distances, making everything faster and smoother.
  • Saving Energy — Just like preparing food in bulk saves time and energy, caching saves resources by storing data in a handy location. It’s like having a pantry stocked with your most-used ingredients, so you don’t have to keep going to the store. This reduces strain on servers and helps them work more efficiently, especially when lots of people are accessing the same information at once.
  • Reliability — Picture having a backup plan when your usual route to work is blocked. Caching acts as a backup by storing important data nearby. So, even if there’s a problem with the main system, you can still access the cached data, ensuring that your apps and websites keep running smoothly without interruptions.
  • Better User Experience — Just like how having a well-organized kitchen makes cooking more enjoyable, caching improves your experience with apps and websites. When things load quickly and smoothly, it feels like everything is working perfectly. You don’t have to wait, and you can get things done without frustration, which keeps you coming back for more.

Below are the different caching options available in a .NET API:

  • In-memory Cache — In .NET 8, in-memory caching is provided by the IMemoryCache abstraction (and its default MemoryCache implementation) in the Microsoft.Extensions.Caching.Memory package; the older MemoryCache in the System.Runtime.Caching namespace exists mainly for .NET Framework compatibility. It allows you to store key-value pairs in memory for a specified duration or until memory pressure forces eviction. In-memory caching is suitable for a single application instance, or for multiple servers that use session affinity (also known as sticky sessions), meaning requests from a given client are always routed to the same server for processing.
  • Distributed Caching — Distributed caching is used to share cached data across multiple instances of an application running on different servers or nodes. .NET provides support for distributed caching through the IDistributedCache abstraction in Microsoft.Extensions.Caching.Distributed, with implementations backed by Redis, SQL Server, and other distributed cache providers.
  • Response Caching Middleware — In ASP.NET Core, response caching middleware allows you to cache HTTP responses at the HTTP level. This middleware caches responses based on specified cache-control directives or other criteria, such as query parameters or request headers. It’s useful for caching entire HTTP responses to reduce server load and improve response times.
  • Output Caching — Output caching stores the rendered output of entire web pages or endpoints. Classically it was available in ASP.NET MVC and ASP.NET Web Forms, configured at the controller or action level with attributes like OutputCacheAttribute, or programmatically with the OutputCache class. Since .NET 7, ASP.NET Core also ships an output caching middleware (AddOutputCache/UseOutputCache) that brings the same idea to minimal APIs and controllers.
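
As a quick taste ahead of the deeper future posts, here is a minimal sketch of in-memory caching with IMemoryCache in a .NET 8 minimal API. The /products endpoint, the cache key format, and the LoadProductFromDbAsync helper are illustrative assumptions, not part of any real application.

```csharp
// Program.cs — minimal .NET 8 API using IMemoryCache.
using Microsoft.Extensions.Caching.Memory;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddMemoryCache();               // registers IMemoryCache
var app = builder.Build();

app.MapGet("/products/{id:int}", async (int id, IMemoryCache cache) =>
{
    // GetOrCreateAsync returns the cached value, or runs the factory
    // once and caches its result under the given key.
    var product = await cache.GetOrCreateAsync($"product:{id}", entry =>
    {
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
        entry.SlidingExpiration = TimeSpan.FromMinutes(1);
        return LoadProductFromDbAsync(id);       // hypothetical data access
    });
    return Results.Ok(product);
});

app.Run();

// Hypothetical stand-in for a real repository or database call.
static Task<string> LoadProductFromDbAsync(int id) =>
    Task.FromResult($"product-{id}");
```

Both absolute and sliding expiration are set here purely to show the available knobs; in practice you would pick policies that match how stale your data is allowed to be.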
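
Similarly, a hedged sketch of distributed caching through IDistributedCache, here backed by Redis via the Microsoft.Extensions.Caching.StackExchangeRedis package. The connection string, key format, WeatherReport type, and FetchWeatherAsync helper are all placeholder assumptions.

```csharp
// Program.cs — IDistributedCache backed by Redis (assumed at localhost:6379).
using System.Text.Json;
using Microsoft.Extensions.Caching.Distributed;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379";    // assumed local Redis
    options.InstanceName = "myapi:";             // key prefix, illustrative
});
var app = builder.Build();

app.MapGet("/weather/{city}", async (string city, IDistributedCache cache) =>
{
    var key = $"weather:{city}";
    var cached = await cache.GetStringAsync(key);
    if (cached is not null)
        return Results.Ok(JsonSerializer.Deserialize<WeatherReport>(cached));

    var report = await FetchWeatherAsync(city);  // hypothetical upstream call
    await cache.SetStringAsync(key, JsonSerializer.Serialize(report),
        new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10)
        });
    return Results.Ok(report);
});

app.Run();

// Hypothetical stand-in for a real weather service client.
static Task<WeatherReport> FetchWeatherAsync(string city) =>
    Task.FromResult(new WeatherReport(city, 21));

record WeatherReport(string City, int TempCelsius);
```

Because the cache lives outside the process, values must be serialized — here with System.Text.Json into strings — which is part of the network overhead discussed below.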

Now let’s look at the pros and cons of each type of caching:

  • MemoryCache:
    Pros:
    Fast access — MemoryCache stores data in the application’s memory, providing extremely fast access times.
    Simple setup — It’s easy to set up and use, requiring minimal configuration.
    Fine-grained control — You can control caching behavior, such as expiration policies and memory limits, to suit your application’s needs.
    Cons:
    Limited scalability — Since MemoryCache is local to each application instance, it doesn’t scale well in distributed environments with multiple servers or nodes.
    Memory usage — Caching large amounts of data in MemoryCache can consume significant amounts of memory, potentially impacting the application’s performance.
  • Distributed Caching:
    Pros:
    Scalability — Distributed caching allows you to share cached data across multiple instances of an application, improving scalability and performance in distributed environments.
    Fault tolerance — Distributed caching solutions often include features for replication and failover, ensuring data availability even in the event of node failures.
    Consistency — Distributed caches often provide mechanisms for cache invalidation and consistency maintenance across nodes.
    Cons:
    Complexity — Setting up and configuring distributed caching solutions can be more complex compared to local caching mechanisms.
    Network overhead — Accessing cached data over the network can introduce additional latency and overhead compared to in-memory caching.
  • Response Caching Middleware:
    Pros:
    HTTP-level caching — Response caching middleware caches entire HTTP responses, reducing server load and improving response times for subsequent requests.
    Fine-grained control — You can configure caching behavior based on various criteria, such as cache-control directives, query parameters, or request headers.
    Cons:
    Limited applicability — Response caching middleware is specific to HTTP responses in ASP.NET Core applications and may not be suitable for caching other types of data or in non-web scenarios.
  • Output Caching:
    Pros:
    Granular control — Output caching allows you to cache specific parts of web pages or user controls, providing fine-grained control over caching behavior.
    Integration with ASP.NET — Output caching is built into ASP.NET MVC and Web Forms, and has also been available in ASP.NET Core as middleware since .NET 7, making it easy to implement caching for web applications.
    Cons:
    Limited to web applications — Output caching is primarily designed for caching web page output and may not be suitable for caching other types of data or in non-web scenarios.
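
To make the response-caching trade-offs above concrete, here is a minimal sketch of the ASP.NET Core response caching middleware; the /time endpoint and the 30-second lifetime are arbitrary assumptions.

```csharp
// Program.cs — response caching middleware at the HTTP level.
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddResponseCaching();
var app = builder.Build();

// Must be registered before the endpoints whose responses it should cache.
app.UseResponseCaching();

app.MapGet("/time", (HttpContext ctx) =>
{
    // The Cache-Control header drives both the middleware and clients.
    ctx.Response.Headers.CacheControl = "public,max-age=30";
    return Results.Ok(new { Utc = DateTime.UtcNow });
});

app.Run();
```

Note that the middleware only caches responses that are cacheable under HTTP rules — for example, GET/HEAD requests without an Authorization header.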
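
And finally, a sketch of the output caching middleware that ASP.NET Core gained in .NET 7, which carries the classic OutputCache idea into minimal APIs; the policy name "Short" and the /catalog endpoint are invented for illustration.

```csharp
// Program.cs — output caching middleware (.NET 7+).
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddOutputCache(options =>
{
    // Named policy caching responses for 60 seconds.
    options.AddPolicy("Short", policy => policy.Expire(TimeSpan.FromSeconds(60)));
});
var app = builder.Build();

app.UseOutputCache();

app.MapGet("/catalog", () => Results.Ok(new[] { "book", "pen" }))
   .CacheOutput("Short");                        // attach the named policy

app.Run();
```

Unlike response caching, the cached output lives server-side, so it does not depend on clients honoring Cache-Control headers.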

With this we conclude this blog; we will look at each of these caching options in more depth, with examples, in future blogs.

🙏Thanks for taking the time to read the article. If you found it helpful and would like to show support, please consider:

  1. 👏👏👏👏👏👏Clap for the story and bookmark for future reference
  2. Follow Chaitanya (Chey) Penmetsa for more content
  3. Stay connected on LinkedIn.

Wishing you a happy learning journey 📈, and I look forward to sharing new articles with you soon.
