Client-side caching when requests happen faster than they are cached
Let me begin at the beginning. Recently I was in a scenario where a list of, let's just call them pins for the sake of something familiar, was being returned by an API call made by the single-page application (SPA) that I work on. Due to the way the data works for this particular SPA, you can end up requesting the same data quite a few times: each pin might show a different view of some data, but the same request would be made anyway to get the raw data for the visualization that card requires.
In order to speed up the page, I started by exploring some client-side caching strategies. There are a ton of good resources out there on this topic, so go Google it. The reason I felt compelled to write this article is that these articles often carry an underlying assumption that the cached request will have finished before another request for the same data comes in. 😦
The main goals for what I needed were:
- Stay away from wrapping all the API calls at the fetch level of the app; I wanted this to be a utility that can be used whenever some ad hoc caching is needed.
- The API should be promise-based, so that you can keep the same flow in place when refactoring existing code, or when writing new code that only caches some of its data.
- If a request comes in with the same cache key as one that is already in flight, the second call should not trigger a new request; it should receive the data once the first one completes.
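That third point is the tricky one. The idea is to cache the promise itself, not just the eventual data, so a second caller arriving mid-flight piggybacks on the first request. A minimal sketch of the mechanism (the names here are mine, not any particular module's):

```javascript
// Map of cache key -> promise. Because the *pending* promise goes into
// the map immediately, concurrent callers for the same key all receive
// the same promise instead of each firing their own request.
const cache = new Map();

function cachedRequest(key, requestFn) {
  if (!cache.has(key)) {
    cache.set(
      key,
      requestFn().catch((err) => {
        cache.delete(key); // don't keep failed requests in the cache
        throw err;
      })
    );
  }
  return cache.get(key);
}
```

Storing the promise rather than the resolved value is what closes the gap between "request started" and "response cached".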
So let’s say you had this:
This is the result of getting the pins for a board, or whatever you want to call it. This implementation will request the same data repeatedly to get the additional details each pin needs to render.
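In code, that pattern looks something like the following. The `pins` data, `getPinData`, and the request counter are made-up stand-ins for the real API, which isn't part of this post:

```javascript
// Stand-in for the real API call; the counter tracks network round trips.
let requestCount = 0;
const getPinData = (dataId) => {
  requestCount++;
  return Promise.resolve({ dataId, points: [1, 2, 3] });
};

// Several pins render different views of the same underlying data set.
const pins = [
  { id: 'a', view: 'bar', dataId: 'sales' },
  { id: 'b', view: 'line', dataId: 'sales' },
  { id: 'c', view: 'pie', dataId: 'sales' },
];

// Each pin fetches its own copy of the raw data, even though the
// dataId is identical for all three.
Promise.all(pins.map((pin) => getPinData(pin.dataId))).then(() => {
  console.log(requestCount); // 3
});
```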
I published a module just recently on npm:
This module helps turn the code above into:
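The module's actual API isn't reproduced here, so the sketch below uses a minimal inline stand-in for it (`cachedRequest` is my name, not necessarily the module's). The call sites barely change:

```javascript
// Minimal stand-in for the published module: a key -> promise map.
const cache = new Map();
const cachedRequest = (key, requestFn) => {
  if (!cache.has(key)) cache.set(key, requestFn());
  return cache.get(key);
};

// Same hypothetical API helper and pins as in the uncached example.
let requestCount = 0;
const getPinData = (dataId) => {
  requestCount++;
  return Promise.resolve({ dataId, points: [1, 2, 3] });
};

const pins = [
  { id: 'a', view: 'bar', dataId: 'sales' },
  { id: 'b', view: 'line', dataId: 'sales' },
  { id: 'c', view: 'pie', dataId: 'sales' },
];

// Same loop as before, but every pin with the same dataId now shares
// one in-flight request via the cache key.
Promise.all(
  pins.map((pin) =>
    cachedRequest(`pin-data:${pin.dataId}`, () => getPinData(pin.dataId))
  )
).then(() => {
  console.log(requestCount); // 1
});
```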
Not a lot more than the first one, and you get the benefit of caching. The API is basically a cache key that you decide on; I'm using prefixes here just to make it more readable for my forgetful self 😃. The second parameter is the requesting function, and the main requirement is that it must return a promise and resolve that promise with the data you want cached.
This example makes only a single request, even though the requests are fired off in a loop.
Caching is always a tough problem to solve, and this may not fit every scenario. But for me, it was useful to have it be promise-based, with the ability to fulfill promises later if they come in before the first request for a key has finished and been cached.
Happy Caching!
