Easy caching: Android + Kotlin + Flow

Andrei Riik
Feb 13, 2023

There are several cases when a good caching mechanism can help you as a developer and improve the quality of your app. Let me briefly tell you about two of them and then about a simple solution.

Case 1.

Let’s imagine you are developing an Android application with a few screens and several API calls. As time passes, more screens and components with background logic appear. So it’s pretty common that after a while two or more GET-like API calls happen in the application at nearly the same time. And sometimes this happens in sensitive places like application launch.

What does it mean for the user? Obviously, extra loading time and battery usage.

Case 2.

You are implementing a feature that requires network or database requests to display some information. Maybe it’s profile data or a history of operations. And you are thinking about how to handle possible loading errors. Usually you’ll show some error message. Maybe you’ll even show a retry button. But what if the user has issues with their network connection? Sometimes it’s better to display slightly outdated information (for example, in a grayish color) alongside the previously mentioned options.

Case N.

I believe you can imagine more cases when an easy-to-use cache could improve your app and save you effort.

Universal Cache.

So let me introduce the approach I’m happy with after some iterations. Here are the core principles I wanted to achieve.

Easy to use.

Just wrap your data flow with a few lines of code and use it as usual or in an advanced way. It will cache results automatically in memory, or anywhere else you configure.

val cachedSource = CachedSource<String, Any>(
    source = { params -> api.getSomething(params) }
)

There are also more properties to tune:

class CachedSource<P : Any, T : Any>(
    source: suspend (params: P) -> T,
    private val cache: Cache<P, T> = MemoryCache(1),
    private val timeProvider: TimeProvider = SystemTimeProvider,
    private val dispatcher: CoroutineDispatcher = Dispatchers.IO,
)
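
For example, here is a minimal sketch of a tuned instance, assuming the MemoryCache constructor argument is the number of entries to keep (the values are purely illustrative):

// A sketch of a customized CachedSource: a larger in-memory cache and a
// different dispatcher. Assumes MemoryCache(size) limits the number of entries.
val tunedSource = CachedSource<String, Any>(
    source = { params -> api.getSomething(params) },
    cache = MemoryCache(10),
    dispatcher = Dispatchers.Default,
)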

And you get a result like this:

cachedSource.get("params", FromCache.IF_HAVE, maxAge = 5_000)
    .collect {
        // Use received values
    }

Ongoing request sharing.

This is optional. Usually we can assume that an additional parallel request isn’t needed. But sometimes it’s still necessary to make a dedicated request, not shared with an ongoing one, to be sure we get the latest information.

We also need to distinguish requests not only by the source of data but also by the exact parameters of the request. So a request like “…endpoint1/?param=A” can be shared, while “…endpoint1/?param=B” is not shared with it.

suspend fun get(
    ...
    shareOngoingRequest: Boolean = true,
    ...
): Flow<T> =
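
As a rough sketch of both modes, run inside a coroutine scope (parameter positions follow the snippets above; api.getSomething and the collectors are assumptions):

coroutineScope {
    // Two parallel collectors with the same params: with the default
    // shareOngoingRequest = true they are expected to share one underlying call.
    launch { cachedSource.get("A", FromCache.NEVER).collect { /* ... */ } }
    launch { cachedSource.get("A", FromCache.NEVER).collect { /* ... */ } }

    // A dedicated request that does not join an ongoing one:
    launch {
        cachedSource.get("A", FromCache.NEVER, shareOngoingRequest = false)
            .collect { /* ... */ }
    }
}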

Different request options.

Basically, there are several main behaviours you might want when loading information:

  • Always get data from a real request, never from the cache
  • Get from a real request, but use the cache if the request failed
  • Get from the cache, but use a real request if there is no suitable cache
  • Get only from the cache (or an error if it’s empty)
  • Get from the cache (fast) and then from a real request (slower); see the sketch below the enum

enum class FromCache {
    NEVER,
    IF_FAILED,
    IF_HAVE,
    ONLY,
    CACHED_THEN_LOAD,
}
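
For example, CACHED_THEN_LOAD can be used to show something immediately and refresh it afterwards (render here is a hypothetical UI function):

// CACHED_THEN_LOAD: up to two emissions — the cached value first (if present),
// then the result of a real request.
cachedSource.get("params", FromCache.CACHED_THEN_LOAD)
    .collect { value ->
        render(value) // called once or twice
    }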

Ability to get cache-related information from the result.

So you can always check whether the data came from the cache or was just loaded, and how old it is if it came from the cache. Based on this information you can manually switch to a new request if needed, and further customize the resulting flow as you like.

data class CachedSourceResult<T : Any>(
    val value: T,
    val fromCache: Boolean,
    val originTimeStamp: Long?,
)
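
For instance, here is a sketch that inspects this metadata to dim possibly outdated data (getRaw is the variant that returns CachedSourceResult, as used in the extension below; showFresh and showPossiblyOutdated are hypothetical UI functions, and originTimeStamp is assumed to be the time the value was originally loaded):

cachedSource.getRaw("params", FromCache.IF_HAVE)
    .collect { result ->
        if (result.fromCache) {
            val age = result.originTimeStamp?.let { System.currentTimeMillis() - it }
            showPossiblyOutdated(result.value, age)
        } else {
            showFresh(result.value)
        }
    }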

And so on.

Of course, there are other useful features, like the ability to reset the cache or listen (via Flow) for any updates. You can also combine flows in a flatMap way if you want to achieve some less common behaviour.

For instance, here is an extension that checks the cache against a custom condition and launches a real request if there is no cache or the condition isn’t met:

suspend fun <P : Any, T : Any> CachedSource<P, T>.getOrRequest(
    params: P,
    fromCachePredicate: (cached: CachedSourceResult<T>) -> Boolean,
): Flow<T> =
    flow {
        getRaw(params, FromCache.IF_HAVE)
            .collect {
                if (!it.fromCache || fromCachePredicate(it)) {
                    emit(it.value)
                } else {
                    emitAll(
                        getRaw(params, FromCache.NEVER)
                            .map { result -> result.value }
                    )
                }
            }
    }

So you can invoke it like this:

source
    .getOrRequest("param", fromCachePredicate = { it.value.someData.isNotEmpty() })
    .collect { ... }

How to start using it?

The project is on GitHub and artifacts are published on JitPack.
Try it yourself and leave feedback:

https://github.com/Andrew0000/Universal-Cache

Thanks for reading!
