Deep Dive Into Caching in Mule 4

Manisha Patil
Another Integration Blog
7 min read · Jun 27, 2024

Are you struggling to get quick responses to your requests? Are they taking too long to respond? Are you hitting the database frequently just to fetch responses to identical requests? Then it’s time to focus on improving the performance of your Mule applications. There are many options to consider when it comes to highly efficient, highly performant APIs, but the most common and simplest one is to apply caching strategies to your APIs. Let’s deep dive into what exactly a cache is, what caching strategies are, and how you can apply them to get improved performance for your APIs.

Caching

The first question that comes to mind is: what exactly is this Cache scope? In very simple terms, the Cache scope stores and reuses frequently requested data. A cache is nothing but a software or hardware component that stores data and serves it quickly when it is requested again within an expiration interval. You can use a Cache scope to reduce the processing load on the Mule instance and to increase the speed of message processing within a flow. It is particularly effective for:

· Processing repeated requests for the same information.

· Processing requests for information that involve large repeatable streams.

In short, the Cache scope helps us eliminate those back-and-forth calls to third-party systems. For example, if we want to fetch the information of a specific employee from a database and we have used a Cache scope, we hit the database only the first time; afterwards, the same response data is served from the cached response.
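As a rough sketch of that example, a Cache scope wrapping a database query could look like the XML below. The flow name, configuration names, and query are illustrative assumptions, not taken from a real project.

<flow name="get-employee-flow">
    <http:listener config-ref="HTTP_Listener_config" path="/employees" />
    <!-- Everything inside the Cache scope runs only on a cache miss -->
    <ee:cache doc:name="Cache">
        <db:select config-ref="Database_Config">
            <db:sql>SELECT * FROM employees WHERE id = :id</db:sql>
            <db:input-parameters><![CDATA[#[{ id: attributes.queryParams.id }]]]></db:input-parameters>
        </db:select>
    </ee:cache>
</flow>

On a cache hit, every processor inside the scope is skipped and the cached payload is returned directly.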

Types of Caching:

Here are two main approaches you can consider:

1. HTTP Caching Policy in API Manager (Server-side caching): This approach involves caching the entire API response. By utilizing the “HTTP Caching” policy in API Manager, you can store reusable responses and improve subsequent request times.

2. Cache Scope or Object Store Connector (Client-side caching): If you want to cache specific back-end responses within your API code, consider using the “Cache Scope” or “Object Store” Connector. These allow you to cache specific endpoints based on your requirements.


How caching works:

Whenever a request arrives at the API, the first thing checked is whether the payload is a consumable (non-repeatable) stream. If the payload is a non-repeatable stream, caching WILL NOT WORK.

The data lookup first happens in the cache. Suppose we decide to go with the default caching strategy, which is nothing but saving responses in memory. Mule generates a key using the SHA-256 key generator (a SHA digest of the message), and the response gets stored as the value for that key, so what gets stored in the cache is a key-value pair. Now, as a request arrives, a check happens on the key: if the same key is present in the cache, it’s a cache hit; if the key does not exist in the cache, it’s a cache miss. In that case, the actual call to the backend system happens, and the response gets stored as a key-value pair in the cache for future requests. Whenever data is found in the cache, the request gets served much more quickly than by hitting the actual backend system, resulting in improved performance and faster retrieval.
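Conceptually, the default key generation is similar to taking a SHA-256 digest of the serialized payload. The DataWeave script below is only an illustration of that idea, not the runtime’s actual internal code:

%dw 2.0
import hashWith from dw::Crypto
import toHex from dw::core::Binaries
output text/plain
---
// The hex-encoded SHA-256 digest of the payload acts as the cache key
toHex(hashWith(write(payload, "application/json") as Binary, "SHA-256"))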

HTTP caching is applied only to HTTP GET requests, since responses to GET are safe to reuse. It serves the major purpose of enhancing speed by delivering content quickly.

Cache Miss vs Cache Hit: Know the Difference

Caching Options:


· Default storage provides in-memory storage; this memory is volatile.

· When you use the Object Store connector for caching, it comes with two options: in-memory as well as persistent, where data gets stored on the file system. If you don’t want it on the local node, there is the Object Store v2 (OS v2) service, a CloudHub service that stores the data for you; it is available only for CloudHub-deployed applications. (A configuration sketch follows this list.)

· Also, you can use an external/custom cache such as Redis or other storage systems.
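For illustration, a persistent object store plus manual store/retrieve operations could be configured as below. The store name, keys, and TTL values are assumptions for this sketch:

<os:object-store name="employeeCache" persistent="true"
    entryTtl="10" entryTtlUnit="MINUTES"
    expirationInterval="1" expirationIntervalUnit="MINUTES" />

<!-- Store a backend response under a key -->
<os:store key="#[attributes.queryParams.id]" objectStore="employeeCache">
    <os:value>#[payload]</os:value>
</os:store>

<!-- Retrieve it later; fall back to a default value on a miss -->
<os:retrieve key="#[attributes.queryParams.id]" objectStore="employeeCache" target="cachedEmployee">
    <os:default-value>#[null]</os:default-value>
</os:retrieve>

With the Object Store connector you manage the hit/miss logic yourself, whereas the Cache scope handles it for you.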

How caching is beneficial:

Caching is an effective strategy that can reduce load, reduce latency, and enhance performance in Mule 4. It has several advantages, as below:

1. Improves performance by responding to requests quickly.

2. In case of backend downtime, it helps keep services running by serving the response that is available in the cache.

3. Reduces unnecessary calls to backend systems, which saves effort as well as the heavy resources that would otherwise be used.

4. Reduces latency for end users.

5. Detects duplicate requests, for example retries from impatient customers.

When to use In-memory caching:

· Small data size

· Data synchronization is not required.

· Data can be reloaded each time the server restarts.

· Note: after every server restart, cached data will be lost.

When to use Persistent caching:

· When the data size is big

· Data should survive restarts.

Cache demo:

Let’s understand this by implementing small POCs.

Cache Scope demo with default caching strategy:

Below is a simple Mule flow with a non-repeatable stream.

In-memory caching configured with a non-repeatable/consumable payload
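A minimal version of such a flow might look like the sketch below; the listener path and configuration names are assumptions. The non-repeatable-stream element on the listener is what makes the payload consumable:

<flow name="cache-default-demo-flow">
    <http:listener config-ref="HTTP_Listener_config" path="/cache-demo">
        <!-- Non-repeatable streaming: the payload can be read only once -->
        <non-repeatable-stream />
    </http:listener>
    <ee:cache doc:name="Cache">
        <http:request method="GET" config-ref="HTTP_Request_config" path="/backend/data" />
    </ee:cache>
</flow>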

When the request comes in, it will first check whether the payload is a consumable/non-repeatable stream.

If the payload is a non-repeatable stream, the caching WILL NOT WORK.

While processing the incoming message, the runtime clearly tells us this in the logs. Please see the logs below.

Logs calling out cache cannot be used because payload is non-repeatable

Once the payload check is completed, the Mule runtime generates a key using the SHA-256 key generator. Once this is done, it checks whether the key is present in the cache; if not, it’s a cache miss. Then the actual call to the backend system happens, and before the response is returned, the key-value pair gets stored in the cache for future requests.

When the key matches one of the existing keys in the cache store, it’s a cache hit: the message does not enter the Cache scope, and the cached data is sent out directly as the response. The cached data is served until the TTL of the data expires or the data is invalidated. In those scenarios, the data is fetched again from the third-party system.

We will get the logs below when the caching works properly.

Cache Scope Executed Successfully

If we use the filter section with the default strategy selected, caching happens only for the requests that satisfy the condition we mention in the filter; for the rest of the requests, caching won’t happen.
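As a sketch, assuming the Cache scope exposes this in XML as the filterExpression attribute, with a hypothetical condition:

<ee:cache doc:name="Cache" filterExpression="#[attributes.queryParams.id != null]">
    <!-- processors whose responses should be cached -->
</ee:cache>

Requests that do not satisfy the expression simply execute the scope’s contents without touching the cache.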

Cache demo with reference to a strategy:

Below is a flow with a reference strategy using a custom key generator approach.

Flow configured with ref to strategy in memory

Here, in the flow, we are just passing the query parameter role as mentor, and in the Transform Message component the logic is written so that if role == "mentor", the payload "MuleSoft mentor" is returned; otherwise, "MuleSoft leader" is returned.
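That transform logic could be written in DataWeave roughly as below (a sketch, assuming the role comes in as a query parameter):

<ee:transform doc:name="Transform Message">
    <ee:message>
        <ee:set-payload><![CDATA[%dw 2.0
output application/json
---
if (attributes.queryParams.role == "mentor") "MuleSoft mentor" else "MuleSoft leader"]]></ee:set-payload>
    </ee:message>
</ee:transform>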

As a request comes in with the query parameter role=mentor, "mentor" is stored as the key and the response is sent out by executing the whole flow. The very next time a request comes in with the same role of mentor, the cache is checked first; if the key is found, the cached data gets returned in the response.

Have a look at what we configured in the reference to a strategy:

Query parameter role=mentor configured as the custom key to store in the cache
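In XML, such a reference strategy with a custom key could look like the sketch below; the strategy name, store alias, and TTL values are assumptions:

<ee:object-store-caching-strategy name="Caching_Strategy"
        keyGenerationExpression="#[attributes.queryParams.role]">
    <!-- In-memory store: entries do not survive a restart -->
    <os:private-object-store alias="role-cache" persistent="false"
        entryTtl="1" entryTtlUnit="MINUTES"
        expirationInterval="30" expirationIntervalUnit="SECONDS" />
</ee:object-store-caching-strategy>

<!-- Referenced from the Cache scope inside the flow -->
<ee:cache doc:name="Cache" cachingStrategy-ref="Caching_Strategy">
    <!-- processors whose result should be cached -->
</ee:cache>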

We will get the logs below once caching works:

Logs showing Cache Scope executed for Subsequent calls

Cache demo with reference strategy using persistent object store:

Let’s take the same flow, with an object store configured.

First of all, we need to add the Object Store connector as a dependency so that it gets downloaded into the .m2 repo.
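In a Maven project, that means adding the connector dependency to pom.xml; the version below is just an example, so use whichever version Exchange suggests:

<dependency>
    <groupId>org.mule.connectors</groupId>
    <artifactId>mule-objectstore-connector</artifactId>
    <version>1.2.2</version>
    <classifier>mule-plugin</classifier>
</dependency>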

We configured the object store as below:

Custom key generator with a persistent object store

Mark it as persistent to make data survive after restarts.

If you don’t check the Persistent checkbox, data will get saved in volatile, in-memory storage.

The main fields here are TTL and expiration interval.

TTL (time to live) is the expiration time after which cached data will be erased.

The expiration interval is the frequency at which a thread runs to check for and remove expired entries.

Ideally, you should set the expiration interval so that expired cached data gets purged at the right intervals.
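Putting it together, a persistent caching strategy with both fields set might look like the sketch below (names and values are illustrative):

<!-- entryTtl: how long an entry lives; expirationInterval: how often expired entries are purged -->
<ee:object-store-caching-strategy name="Persistent_Caching_Strategy"
        keyGenerationExpression="#[attributes.queryParams.role]">
    <os:private-object-store alias="persistent-cache" persistent="true"
        entryTtl="30" entryTtlUnit="MINUTES"
        expirationInterval="5" expirationIntervalUnit="MINUTES" />
</ee:object-store-caching-strategy>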

When a request comes in for the first time, the flow executes everything, including the contents of the Cache scope. From the next request onwards, Mule first checks whether the key is present in the object store; if it is, the message does not enter the Cache scope, and the response is returned from the cached data.

We will get the logs below upon successful processing of caching.

Logs showing Cache scope executed for Subsequent calls

If you observe the logs, the first request enters the Cache scope, but subsequent requests return the cached response until the TTL expires.

Conclusion:

As you implement caching strategies in your Mule projects, remember to balance the TTL and the expiration interval: it is essential to purge data at the right time and to refresh it on a timely basis. This will help improve the performance and responsiveness of your Mule applications.
