.NET 7 and Memory Cache
Enhancing API Performance with IMemoryCache
In the realm of APIs, speed is everything. That’s why I’m here to introduce you to a simple but effective technique: memory caching. By storing frequently accessed data directly in your server’s memory, you can slash response times and reduce the strain on your database, significantly boosting your API’s speed and efficiency. Let’s dive in!
Memory Cache
Scenario: You have an API that returns data that rarely changes, such as a list of cities or countries, or business-related lookup tables. These endpoints are requested often, access the database on every call, and return data that is seldom modified.
One way to enhance the performance of these requests and, at the same time, reduce the number of database accesses is to use memory caching.
Memory caching utilizes server memory to store data. As a result, access to this data is faster than accessing the database.
However, it’s important to be mindful that memory caching is suitable for a single server: with multiple servers, there’s a risk that a request hits a server where the data is not in memory.
Even so, in the case of multiple servers, it’s possible to use “sticky sessions.” With this, client requests will be routed and processed on the same server. A better solution for multiple server situations would be a distributed cache, like Redis.
IMemoryCache
IMemoryCache is the interface used for managing memory cache data. To use it, simply call the AddMemoryCache extension method on an IServiceCollection object. This can be done in the Program class.
builder.Services.AddMemoryCache();
With this, your application can access the service instance via dependency injection.
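For context, a minimal Program class wiring this up might look like the sketch below. This assumes the standard WebApplication builder template; AddHttpClient and AddControllers are included here because the controller shown later in this article depends on IHttpClientFactory and controller routing:

```csharp
var builder = WebApplication.CreateBuilder(args);

// Register IMemoryCache with the dependency injection container.
builder.Services.AddMemoryCache();

// Also register IHttpClientFactory and controllers,
// which the example controller in this article uses.
builder.Services.AddHttpClient();
builder.Services.AddControllers();

var app = builder.Build();
app.MapControllers();
app.Run();
```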
The main methods to be used are:
- Set(string key, TItem value): creates an entry in the memory cache under the given key, with a value of the generic type TItem, replacing the stored value if one already exists for that key.
- Remove(string key): removes the object stored in the cache under the given key.
- TryGetValue(string key, out object value): tries to retrieve the object stored under the given key and, if it exists, assigns it to the value out parameter.
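To see these three methods in isolation, here is a minimal self-contained sketch that instantiates a MemoryCache directly, instead of resolving IMemoryCache through dependency injection as the controller example below does:

```csharp
using Microsoft.Extensions.Caching.Memory;

// Standalone sketch: create a MemoryCache directly.
var cache = new MemoryCache(new MemoryCacheOptions());

// Set: store a value under a key.
cache.Set("Greeting", "Hello, world!");

// TryGetValue: attempt to read the value back.
if (cache.TryGetValue("Greeting", out object greeting))
{
    Console.WriteLine(greeting); // prints "Hello, world!"
}

// Remove: evict the entry.
cache.Remove("Greeting");
Console.WriteLine(cache.TryGetValue("Greeting", out _)); // prints "False"
```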
For this example, I use RestCountries, a public API with data about the countries of the world. Since this API returns A LOT of data for each country, I selected a few simple properties and created the Country class, shown below.
public class Country
{
    public string Name { get; set; }
    public string Capital { get; set; }
    public string Region { get; set; }
    public string Demonym { get; set; }
}
Next, I implement the GetCountries action, accessed via the HTTP GET method at the path api/countries.
using ArticleInMemmoryCache.Models;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Caching.Memory;
using Newtonsoft.Json;

namespace ArticleInMemmoryCache.Controllers
{
    [Route("api/[controller]")]
    public class ImprovedCountriesControllerWithComments : ControllerBase
    {
        private readonly IMemoryCache _memoryCache;
        private readonly IHttpClientFactory _clientFactory;
        private const string COUNTRIES_KEY = "Countries";
        private readonly string restCountriesUrl = "https://restcountries.eu/rest/v2/all";

        public ImprovedCountriesControllerWithComments(IMemoryCache memoryCache, IHttpClientFactory clientFactory)
        {
            _memoryCache = memoryCache;
            _clientFactory = clientFactory;
        }

        [HttpGet]
        public async Task<IActionResult> GetCountries()
        {
            // Check if the countries are already in the cache.
            // If so, return them; otherwise, fetch them from the
            // REST API, cache them, and return them.
            return _memoryCache.TryGetValue(COUNTRIES_KEY, out object countriesObject)
                ? Ok(countriesObject)
                : Ok(await FetchAndCacheCountriesAsync());
        }

        // Fetches the countries from the REST API and caches the result
        private async Task<List<Country>> FetchAndCacheCountriesAsync()
        {
            // Use an HttpClient created by IHttpClientFactory
            var client = _clientFactory.CreateClient();
            var response = await client.GetAsync(restCountriesUrl);

            // Read the response body as a string
            var responseData = await response.Content.ReadAsStringAsync();

            // Deserialize the JSON string into a list of Country objects
            var countries = JsonConvert.DeserializeObject<List<Country>>(responseData);

            // Define the cache options
            var memoryCacheEntryOptions = new MemoryCacheEntryOptions
            {
                // Absolute expiration relative to now (1 hour)
                AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(3600),
                // Sliding expiration (20 minutes)
                SlidingExpiration = TimeSpan.FromSeconds(1200)
            };

            // Store the countries in the cache with the defined options
            _memoryCache.Set(COUNTRIES_KEY, countries, memoryCacheEntryOptions);

            // Return the countries
            return countries;
        }
    }
}
In the example above, notice the flow:
- Check if there is an entry with the key specified in the constant COUNTRIES_KEY.
- If so, return OK with the list. This is the quickest path.
- If not, set the URL for the HTTP call.
- Instantiate an HttpClient object.
- Make the call to the defined URL, then deserialize the response into a list of Country objects.
- Instantiate the memory cache options, defining an absolute expiration time (1 hour, or 3600 seconds) and a sliding expiration whose countdown restarts after each access (20 minutes, or 1200 seconds). With this, after 20 minutes without any access, the entry is removed from memory; and no matter how often it is accessed, it is removed after 1 hour regardless.
- Save the object returned by the API in the memory cache.
- Return OK with the list.
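As a side note, this check-then-fetch flow can also be written more compactly with the GetOrCreateAsync extension method, which runs a factory delegate only on a cache miss and lets you set the expiration options on the entry. A sketch of the same action in that style (reusing the fields from the controller above):

```csharp
[HttpGet]
public async Task<IActionResult> GetCountries()
{
    // GetOrCreateAsync returns the cached value if present; otherwise it
    // runs the factory delegate, caches its result, and returns it.
    var countries = await _memoryCache.GetOrCreateAsync(COUNTRIES_KEY, async entry =>
    {
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(3600);
        entry.SlidingExpiration = TimeSpan.FromSeconds(1200);

        var client = _clientFactory.CreateClient();
        var responseData = await client.GetStringAsync(restCountriesUrl);
        return JsonConvert.DeserializeObject<List<Country>>(responseData);
    });

    return Ok(countries);
}
```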
Considerations and Drawbacks of Memory Caching
While memory caching offers numerous benefits for enhancing API performance, it’s important to know its limitations and potential drawbacks. Here are a few key considerations to keep in mind:
- Limited Memory Capacity: Memory caching relies on the server’s memory capacity to store data. Depending on the size and complexity of your dataset, you may encounter memory limitations. If the cache becomes too large, it can lead to memory pressure, affecting overall system performance. Monitoring memory usage and ensuring your cache remains within manageable limits is crucial.
- Data Consistency: Memory caching improves response times by serving data directly from memory. However, this can introduce challenges in maintaining data consistency. As the cached data is separate from the authoritative data source, such as a database, updates made to the source may not immediately reflect in the cache. Implementing strategies for cache invalidation or expiration is essential to ensure data integrity.
- Cache Invalidation: It’s vital to handle cache invalidation effectively when using memory caching. If the underlying data changes frequently, ensuring that the cache reflects the most up-to-date information becomes more complex. You may need to implement mechanisms such as time-based expiration, event-based invalidation, or manual cache clearing to keep the cache consistent with the data source.
- Scalability with Multiple Servers: Memory caching becomes more challenging when you have multiple servers in a distributed system. In such cases, maintaining cache consistency across multiple servers can be tricky. Sticky sessions or distributed caching solutions like Redis can help mitigate these challenges, but they add complexity to your architecture.
- Cache Warm-up: When starting a new instance or after a cache reset, the cache is empty and needs to be populated. This can lead to increased response times for initial requests until the cache is warmed up. Careful consideration should be given to strategies for preloading the cache or handling the temporary performance impact during cache warm-up.
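For illustration, the manual-clearing approach from the cache-invalidation point above could look like this in the countries example, assuming a hypothetical update endpoint on the same controller:

```csharp
[HttpPost]
public IActionResult UpdateCountry([FromBody] Country country)
{
    // Persist the change to the authoritative data source here
    // (omitted; this sketch only shows the cache side).

    // Invalidate the cached list so the next GET rebuilds it
    // from the updated source instead of serving stale data.
    _memoryCache.Remove(COUNTRIES_KEY);

    return NoContent();
}
```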
By understanding these considerations and planning accordingly, you can effectively address the potential drawbacks of memory caching and optimize its usage in your API infrastructure. With careful design and implementation, memory caching can deliver significant performance gains and improve the user experience.
Thanks for reading! Before you go:
If you find this article helpful, please consider giving claps and following me. 👏👉