In-Memory Cache: Use of Memory Caching In .NET Core

Mert Savaş · Published in Berkut Teknoloji · Dec 25, 2021

Hello everyone! Today, we’re going to look at the use of in-memory caching in .NET Core. Have a nice read ❤

What is In-Memory Caching?

In-Memory Caching is a method used to provide faster responses to incoming requests. When data is requested a second time, the application can retrieve it from the cache rather than from the original source, such as a database. In this way, the data is accessed faster: the main reason this works is that reading data already held in memory is much quicker than fetching it from mass storage or over the network. With in-memory caching, we can develop faster and more efficient applications.

Data Caching

Data Caching means caching data from a data source such as a web service, database, or file. As long as the cached entry has not expired, we can serve the data from the cache. When it expires, we fetch the data from the data source again and write it back to the cache.

Some Terms That We May Need To Know

  • Cache Hit: A cache hit occurs when the requested data is already in the cache.
  • Cache Miss: A cache miss occurs when the requested data is not in the cache. In this case, we send a request to the data source, fetch the data from there, and write it back into the cache (see the sketch after this list).
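
To make the two terms concrete, here is a minimal conceptual sketch of the hit/miss flow. The Dictionary stands in for any cache, and GetCityFromDatabase is a hypothetical data-source call, not part of any real API.

// Conceptual sketch only: a plain Dictionary plays the role of the cache.
private static readonly Dictionary<int, string> _cityCache = new Dictionary<int, string>();

public string GetCity(int plateCode)
{
    // Cache hit: the requested data is already in the cache.
    if (_cityCache.TryGetValue(plateCode, out var city))
        return city;

    // Cache miss: fetch from the original data source and write it back to the cache.
    city = GetCityFromDatabase(plateCode); // hypothetical data-source call
    _cityCache[plateCode] = city;
    return city;
}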

In-Memory Cache In .NET Core

In .NET Core, we can write data to the cache, and read or remove it, using the IMemoryCache interface from the Microsoft.Extensions.Caching.Memory library.

In some cases we may want to expire the data, or keep it from consuming so much memory that it makes our application unusable. We can achieve this with the MemoryCacheEntryOptions class, whose main options are listed below (a short sketch follows the list).

  • AbsoluteExpiration: Ensures that the cache entry is removed at the end of the specified time, regardless of how often it is accessed.
  • ExpirationTokens: Uses change-token instances to expire the cache entry on demand.
  • Priority: Determines the order in which entries are evicted to free memory, based on the CacheItemPriority enum value (Low, Normal, High, NeverRemove).
  • Size: Sets the size of the cache entry, which counts against the cache’s SizeLimit if one is configured.
  • SlidingExpiration: Determines how long the cache entry can be inactive (i.e., not accessed) before it is removed. Accessing the entry before it is removed extends its lifetime, but never beyond the absolute expiration (if one is set).
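
Below is a minimal sketch, with illustrative values, of how these options can be combined on a single entry using the fluent extension methods. CancellationChangeToken comes from Microsoft.Extensions.Primitives.

using System;
using System.Threading;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Primitives;

var cts = new CancellationTokenSource();

var options = new MemoryCacheEntryOptions()
    .SetAbsoluteExpiration(TimeSpan.FromHours(1))        // removed one hour after being written
    .SetSlidingExpiration(TimeSpan.FromMinutes(10))       // removed if not accessed for ten minutes
    .SetPriority(CacheItemPriority.High)                  // evicted later than Low/Normal entries under memory pressure
    .SetSize(1)                                           // counts against the cache's SizeLimit, if one is configured
    .AddExpirationToken(new CancellationChangeToken(cts.Token)); // calling cts.Cancel() expires the entry on demand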

Now let’s look at how the In-Memory Cache is used in a sample project.

Implementation

First, let’s enable the memory cache by adding services.AddMemoryCache() to the ConfigureServices method in the Startup.cs class.
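
A minimal version of that registration might look like this (the controller registration is just illustrative):

// Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    // Registers IMemoryCache so it can be injected anywhere in the application.
    services.AddMemoryCache();

    services.AddControllers();
}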

Then, let’s go back to our related controller and inject the IMemoryCache interface in the constructor.
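
Assuming a controller named CityController for this sample (the name and route are illustrative), the injection looks like this:

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Caching.Memory;

[ApiController]
[Route("api/[controller]")]
public class CityController : ControllerBase
{
    private readonly IMemoryCache _memoryCache;

    // The IMemoryCache instance registered in ConfigureServices is supplied by the DI container.
    public CityController(IMemoryCache memoryCache)
    {
        _memoryCache = memoryCache;
    }
}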

Now we are ready to use the ‘In-Memory Cache’. In this sample project, we can look up a city by its plate code.

Methods

Storing Data In Cache

For the cache entry options, we set the AbsoluteExpiration of the entry to 1 hour, the SlidingExpiration to 10 minutes (so it is removed if not accessed within that time), and the Priority to High. Then we call Set with the plate code as the key, the city information as the value, and the cacheExpiryOptions object as the MemoryCacheEntryOptions.
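
A sketch of that step, with a hypothetical GetCityFromDatabase call standing in for the data source, might look like this:

var cacheExpiryOptions = new MemoryCacheEntryOptions
{
    AbsoluteExpirationRelativeToNow = TimeSpan.FromHours(1), // removed one hour after being written
    SlidingExpiration = TimeSpan.FromMinutes(10),            // removed if not accessed for ten minutes
    Priority = CacheItemPriority.High
};

var city = GetCityFromDatabase(plateCode);                   // hypothetical data-source call
_memoryCache.Set(plateCode, city, cacheExpiryOptions);       // key: plate code, value: city, options: cacheExpiryOptions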

Fetching Data From Cache

We send the plate code as the key to pull the data from the cache. If the data is found in the cache, the returned value is assigned to value.
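
A minimal sketch of that read path:

// TryGetValue returns true on a cache hit and assigns the cached value to the out parameter.
if (_memoryCache.TryGetValue(plateCode, out string value))
{
    return Ok(value);
}

// Cache miss: fall back to the data source and write the result back to the cache, as shown above.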

Removing Data From Cache

We may want to remove data from the cache before it expires on its own. For such cases, we can easily add this small code block to our code base.
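
The removal itself is a single call:

// Removes the entry with the given key from the cache, regardless of its remaining lifetime.
_memoryCache.Remove(plateCode);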

In this article, I tried to explain and apply in-memory caching as best I could. I hope you enjoyed it. Bye ❤
