Why Data Caching?
When creating an app that has a network layer, it’s important to plan a caching strategy that works well for your app. Caching improves the performance of apps because fetching local data is faster than fetching remote data.
Caching is also important for offline mode. With a good caching strategy in place, the user will be able to use the app in offline mode without noticing any difference compared to when it’s online.
Caching can also be used to store other external sources of data, such as user input (adding to favourites), the device’s last known location, and data from the device’s sensors. For ease of explanation, I will focus on caching for network requests.
When planning a caching strategy, it’s important to understand your app’s communication flows.
Lazy Caching Strategy:
This is one of the easiest algorithms to use when caching data. When the user makes a service request, we first check whether the response is cached; if so, we return the data to the UI. If the response is not cached, we request it from the network, store the response in the cache, and then return the data to the UI.
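The lazy strategy can be sketched in a few lines. This is a minimal illustration, assuming an in-memory map as the cache and a `Supplier` as a stand-in for the real network layer, not a production implementation:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Lazy caching: return cached data if present; otherwise fetch from the
// "network", store the response in the cache, and return it.
class LazyCache {
    private final Map<String, String> cache = new HashMap<>();

    String get(String url, Supplier<String> networkFetch) {
        // 1. Cache hit: return immediately without touching the network.
        String cached = cache.get(url);
        if (cached != null) return cached;
        // 2. Cache miss: fetch, cache, then return.
        String response = networkFetch.get();
        cache.put(url, response);
        return response;
    }
}
```

Note that once a URL is cached, the network fetcher is never invoked again for it, which is exactly the staleness problem described next.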
The problem with this strategy is that it will always return the cached data, if available, to the UI. What if the data updates in the backend? The user will never see the updated data if it was already previously cached.
Synchronized Caching Strategy:
This strategy aims to solve the problem identified in the Lazy Caching Strategy. When the user makes a service request, if the response is not already cached, we fetch it from the network, cache it, and return the data to the UI. If the response is cached, we check whether an update is available on the backend. If there is an update, we make a service request, cache the new response, and return the data to the UI. If there is no update available, we return the cached data to the UI.
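The extra step over the lazy strategy is the update check before serving cached data. In the sketch below, `isUpdateAvailable` is a hypothetical stand-in for a lightweight freshness check against the backend (for example an ETag or Last-Modified comparison); the map and suppliers are illustrative, not a real HTTP stack:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Synchronized caching: serve from cache only when the backend reports
// no newer data; otherwise refetch and refresh the cache.
class SynchronizedCache {
    private final Map<String, String> cache = new HashMap<>();

    String get(String url, Supplier<String> networkFetch,
               Supplier<Boolean> isUpdateAvailable) {
        String cached = cache.get(url);
        // Refetch on a cache miss, or when the backend has newer data.
        if (cached == null || isUpdateAvailable.get()) {
            String response = networkFetch.get();
            cache.put(url, response);
            return response;
        }
        // Cached and up to date: serve locally.
        return cached;
    }
}
```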
The problem with this strategy is that it will only update the cached data if there is an update available. What if we need to expire the cached data after N minutes so that the next time a service request is made it fetches from the network and updates the cache?
E-Synchronized Caching Strategy:
This strategy aims to solve the problem identified in the Synchronized Caching Strategy. When the user makes a service request, if the response is not already cached, we fetch it from the network, cache it along with a timestamp of when it was cached, and return the data to the UI. If the response is cached, we check whether the cached data has expired using the timestamp. If it has expired, we make a service request, cache the response with a new timestamp, and return the data to the UI. If the cache has not expired, we return the cached data to the UI.
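The timestamp check above amounts to a time-to-live (TTL) cache. In this minimal sketch, the current time is passed in explicitly rather than read from the system clock, which is an assumption made purely for testability; the map and supplier are again stand-ins for real storage and networking:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// E-synchronized caching: each entry carries the timestamp at which it
// was cached; entries older than the TTL are refetched from the network.
class ExpiringCache {
    private static class Entry {
        final String value;
        final long cachedAtMillis;
        Entry(String value, long cachedAtMillis) {
            this.value = value;
            this.cachedAtMillis = cachedAtMillis;
        }
    }

    private final Map<String, Entry> cache = new HashMap<>();
    private final long ttlMillis;

    ExpiringCache(long ttlMillis) {
        this.ttlMillis = ttlMillis;
    }

    String get(String url, Supplier<String> networkFetch, long nowMillis) {
        Entry entry = cache.get(url);
        // Fresh entry: serve from cache without a network call.
        if (entry != null && nowMillis - entry.cachedAtMillis < ttlMillis) {
            return entry.value;
        }
        // Missing or expired: refetch and stamp with the new time.
        String response = networkFetch.get();
        cache.put(url, new Entry(response, nowMillis));
        return response;
    }
}
```

In a real app, `nowMillis` would typically come from `System.currentTimeMillis()` at the call site.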
Live Caching Strategy:
This strategy is recommended when you need to always display up-to-date data to the user. Imagine a logistics app for an eBusiness company. This app will need to always fetch data from the network, if online, to ensure that the app is in sync with the backend and has all of the latest trips cached locally. This will allow the driver to continue with their current, next, and subsequent trips when the app is offline.
When the user makes a service request, we must always cache the response to keep the cache up-to-date. When the app is offline we can then fetch data from the cache and return it to the UI.
Asynchronous Offline Caching Service:
The caching strategies discussed so far work great for apps that have one-way communication (GET) with a backend server, but what about apps that have two-way communication (GET and POST) with a backend server? How would they work in offline mode?
When in offline mode we can cache all POST requests made by the user, using the URL as the key and the request body as the value. We can then run a background service every N minutes that posts all cached POST requests, deleting each one that posts successfully. We also need a strategy for bad cached POST requests that keep failing, otherwise the app will retry the same request forever. To avoid this, we can cache a post attempt counter alongside each request and delete the request once the counter is greater than or equal to N.
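The queue-and-retry logic above can be sketched as follows. This is an illustrative in-memory version; keying by URL (as described) allows one pending request per URL, and the `post` callback is a hypothetical stand-in for the real HTTP call made by the background job:

```java
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.BiPredicate;

// Offline POST queue: URL -> pending request body plus an attempt counter.
// Requests are removed once posted successfully, or dropped after
// maxAttempts consecutive failures so the app never retries forever.
class OfflinePostQueue {
    private static class Pending {
        final String body;
        int attempts = 0;
        Pending(String body) { this.body = body; }
    }

    private final Map<String, Pending> queue = new LinkedHashMap<>();
    private final int maxAttempts;

    OfflinePostQueue(int maxAttempts) {
        this.maxAttempts = maxAttempts;
    }

    void enqueue(String url, String body) {
        queue.put(url, new Pending(body));
    }

    // Called by the periodic background job; `post` returns true on success.
    void flush(BiPredicate<String, String> post) {
        Iterator<Map.Entry<String, Pending>> it = queue.entrySet().iterator();
        while (it.hasNext()) {
            Map.Entry<String, Pending> e = it.next();
            Pending p = e.getValue();
            if (post.test(e.getKey(), p.body)) {
                it.remove(); // posted successfully
            } else if (++p.attempts >= maxAttempts) {
                it.remove(); // give up on a persistently failing request
            }
        }
    }

    int pendingCount() { return queue.size(); }
}
```

On Android, the periodic background job that calls `flush` would typically be scheduled with WorkManager, and the queue itself persisted in local storage rather than memory.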
The caching strategies discussed can be implemented using any storage solution, such as Jetpack DataStore, Realm, greenDAO, Room Database, and many more. In Part 2 of the Data Access Strategies series, we will discuss Room Database.