Response caching in .NET Core
Recently, on one of the projects I was part of, we ran into issues with long response times. In the beginning, despite the large number of HTTP calls, the application worked perfectly and everything looked good… until the database grew more and more populated with data, and requests started taking too long to respond.
.NET has a built-in solution for response caching, but in my case it didn't work: the requests carried an Authorization header, which is one of the conditions under which the built-in library refuses to cache responses.
To solve this problem I came up with a caching solution, later turned it into a generic one, and published it as a library [Check Here]. The library provides two options of caching:
- Memory caching
- Distributed caching
Solution
When I started to implement the solution, I considered one of the following approaches:
- creating a generic service and injecting it into controllers,
- creating a middleware combined with caching rules,
- using the power of [Attributes].
After some research, I decided to use [Attributes]. Below I describe the implementation step by step.
First we create a simple contract that holds two methods: one for storing a response and another for reading a cached response.
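A minimal sketch of that contract could look like the following (the article only names `CacheResponse` explicitly; `GetCachedResponse` is an illustrative name for the read method, not necessarily the library's exact API):

```csharp
using System;
using System.Threading.Tasks;

public interface ICacheService
{
    // Stores a response under the given key for the given lifetime.
    Task CacheResponse(string key, object response, TimeSpan timeToLive);

    // Returns the cached response for the key, or null when none exists.
    Task<string> GetCachedResponse(string key);
}
```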
As I mentioned above, the library provides two options of caching — Memory and Distributed caching. So for each one I've created a separate service that implements [ICacheService].
To be able to use the distributed cache, I created a new class that holds the ConnectionString of the server (Redis or something else) where the data will be stored.
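The settings class, together with a possible distributed implementation built on top of `IDistributedCache`, might look like this (a sketch, assuming JSON serialization of responses; the library's internals may differ):

```csharp
using System;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;

// Bound from the "CacheSettings" section of appsettings.json.
public class CacheSettings
{
    public string ConnectionString { get; set; }
}

// A possible distributed implementation of ICacheService.
public class DistributedCacheService : ICacheService
{
    private readonly IDistributedCache _distributedCache;

    public DistributedCacheService(IDistributedCache distributedCache)
    {
        _distributedCache = distributedCache;
    }

    public async Task CacheResponse(string key, object response, TimeSpan timeToLive)
    {
        if (response == null) return;

        var serialized = JsonSerializer.Serialize(response);
        await _distributedCache.SetStringAsync(key, serialized,
            new DistributedCacheEntryOptions
            {
                // The entry expires after the time-to-live elapses.
                AbsoluteExpirationRelativeToNow = timeToLive
            });
    }

    public async Task<string> GetCachedResponse(string key)
    {
        return await _distributedCache.GetStringAsync(key);
    }
}
```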
I've also created another service that implements the memory-cache technique.
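A memory-backed counterpart could be sketched on top of `IMemoryCache` like so (again an illustration, not necessarily the library's exact code):

```csharp
using System;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

// A possible in-memory implementation of ICacheService.
public class MemoryCacheService : ICacheService
{
    private readonly IMemoryCache _memoryCache;

    public MemoryCacheService(IMemoryCache memoryCache)
    {
        _memoryCache = memoryCache;
    }

    public Task CacheResponse(string key, object response, TimeSpan timeToLive)
    {
        if (response != null)
        {
            // The entry is evicted automatically once the time-to-live expires.
            _memoryCache.Set(key, JsonSerializer.Serialize(response), timeToLive);
        }
        return Task.CompletedTask;
    }

    public Task<string> GetCachedResponse(string key)
    {
        _memoryCache.TryGetValue(key, out string cachedResponse);
        return Task.FromResult(cachedResponse);
    }
}
```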
Now let's get to the main part of the library — creating a new attribute called [ResponseCacheAttribute]. Our custom attribute inherits from the Attribute class, implements the IAsyncActionFilter interface, and provides the method OnActionExecutionAsync, which takes two parameters:
- ActionExecutingContext — provides context data to the filter; in our case it is used to get an instance of ICacheService.
- ActionExecutionDelegate — wraps the action method (or the next filter) to be executed, which allows us to inspect the response of the next action (executedContext.Result) and, based on it, decide whether to cache or not.
First we create a unique key per request and then check whether that key already holds a cached response. If so, we simply return that response; otherwise we let the delegate execute, and if the context executed successfully we call our method to cache the response.
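One way to build such a key — the library's exact scheme may differ — is to combine the request path, the query string (sorted so that parameter order does not produce different keys), and, since this library exists precisely because responses vary per authenticated caller, the Authorization header; including that header is my assumption, not something the article states:

```csharp
using System.Linq;
using System.Text;
using Microsoft.AspNetCore.Http;

public static class CacheKeyGenerator
{
    public static string FromRequest(HttpRequest request)
    {
        var keyBuilder = new StringBuilder(request.Path);

        // Sort query parameters so that ?a=1&b=2 and ?b=2&a=1 share a key.
        foreach (var (name, value) in request.Query.OrderBy(q => q.Key))
        {
            keyBuilder.Append($"|{name}-{value}");
        }

        // Assumed: separate cache entries per Authorization header,
        // so authenticated users never see each other's responses.
        if (request.Headers.TryGetValue("Authorization", out var authorization))
        {
            keyBuilder.Append($"|{authorization}");
        }

        return keyBuilder.ToString();
    }
}
```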
The method CacheResponse accepts three arguments:
- Key — unique, generated from the request
- Response
- TimeToLive — this argument is defined on the attribute
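Putting the flow together, the attribute might be sketched as follows (a simplified reconstruction under the assumptions above; the key generation is reduced to path plus query, and only successful `Ok` results are cached):

```csharp
using System;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Filters;
using Microsoft.Extensions.DependencyInjection;

[AttributeUsage(AttributeTargets.Method | AttributeTargets.Class)]
public class ResponseCacheAttribute : Attribute, IAsyncActionFilter
{
    private readonly int _timeToLiveSeconds;

    public ResponseCacheAttribute(int timeToLiveSeconds)
    {
        _timeToLiveSeconds = timeToLiveSeconds;
    }

    public async Task OnActionExecutionAsync(
        ActionExecutingContext context, ActionExecutionDelegate next)
    {
        // Resolve the cache service from the request's DI container.
        var cacheService = context.HttpContext.RequestServices
            .GetRequiredService<ICacheService>();

        var key = GenerateCacheKey(context.HttpContext.Request);

        // Short-circuit with the cached response when one exists.
        var cachedResponse = await cacheService.GetCachedResponse(key);
        if (!string.IsNullOrEmpty(cachedResponse))
        {
            context.Result = new ContentResult
            {
                Content = cachedResponse,
                ContentType = "application/json",
                StatusCode = 200
            };
            return;
        }

        // Otherwise execute the action (or the next filter)...
        var executedContext = await next();

        // ...and cache the result only when it completed successfully.
        if (executedContext.Result is OkObjectResult okResult)
        {
            await cacheService.CacheResponse(
                key, okResult.Value, TimeSpan.FromSeconds(_timeToLiveSeconds));
        }
    }

    // Simplified key: request path plus sorted query string.
    private static string GenerateCacheKey(HttpRequest request)
    {
        var keyBuilder = new StringBuilder(request.Path);
        foreach (var (name, value) in request.Query.OrderBy(q => q.Key))
        {
            keyBuilder.Append($"|{name}-{value}");
        }
        return keyBuilder.ToString();
    }
}
```

Note that this custom attribute shares its name with ASP.NET Core's built-in `[ResponseCache]`, so in a project that imports both you would need to disambiguate with a namespace.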
To use the services of the library we need to create the extension methods AddDistributedResponseCache and AddInMemoryResponseCache, which can be used in your Startup.cs.
Setup
NuGet install
Install-Package ResponseCache
Memory cache configuration
Cache responses in memory by simply adding the memory-cache service and our custom service via AddInMemoryResponseCache.
Distributed Cache configuration — Redis
To use the distributed cache, first we need to add the distributed cache connection string to appsettings.json and bind it to the CacheSettings class.
"CacheSettings": {
"ConnectionString": "localhost:6379"
}
After that, add the distributed cache services (Redis in this case) and finally add our custom service via AddDistributedResponseCache.
Usage
Use the [ResponseCache(TTL)] attribute on any action/endpoint you want. TTL is an integer representing the cache lifetime in seconds.
Example
Here is an example that caches a response — a list of products — for 1 minute. As you can see, we've put [ResponseCache(60)] right above the List() action.
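The controller could look like this (a sketch; the product data is hard-coded here to stand in for whatever data access the real action performs):

```csharp
using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/[controller]")]
public class ProductsController : ControllerBase
{
    // Cache the serialized response of this action for 60 seconds.
    [HttpGet]
    [ResponseCache(60)]
    public IActionResult List()
    {
        // Placeholder data; a real action would query a database or service.
        IEnumerable<string> products = new[] { "Keyboard", "Mouse", "Monitor" };
        return Ok(products);
    }
}
```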
Conclusion
The average user expects a web page to load within 2 seconds, and some will wait no more than 10 seconds before leaving the page. So always consider the performance and scalability of your application: an application that performs and scales well stays stable and keeps page loads fast even as the data and the traffic grow.
If you liked this article and want to hear more you can find me at