Modern mobile applications are full of images and even though access to higher network speeds and bandwidth is increasing, it’s still vital to decrease load times as much as possible to provide the best user experience.
Newer devices also feature higher screen densities, which call for higher-resolution images. More image data again leads to longer load times, which can reduce or even cancel out the benefit of improved network speeds.
It is thus imperative to keep load times to a minimum, e.g. by reducing image size/quality to something acceptable for the given application and by reducing the number of image requests. But even with “optimal” image sizes, loading still takes time and the user is bound to wait before the image is presented. So how can we improve the user experience? We can reduce the perceived load time.
Perceived load time is a subset of perceived performance, which is an important topic for mobile app designers/developers.
This is nothing new. De facto solutions include visual loading states, skeleton views, static placeholder images, and color placeholders (either random colors or colors derived from metadata, e.g. based on the image that will be loaded).
For more information on perceived performance have a look at Florian Marcu’s article¹ linked below.
Focusing on perceived performance of image loading
Many of these placeholder solutions are still unrelated, or at least quite dissimilar, to the final image. In my opinion, perceived performance improves if we can present something that resembles the final image while it loads.
Back in August 2015 Facebook Engineering posted an article on the technology behind their “preview photos” written by Edward Kandrot². It explains how Facebook’s mobile clients used (and possibly still use) a process of embedding into the network response both the URL of the image together with an “approximation” which can be displayed while loading.
The approximation is a tiny downscaled version (in the order of hundreds of bytes) of the full size image, which can be upscaled and blurred on the client.
This effectively removes one request/response round trip which results in lower load times and better perceived performance². The end result is a temporary placeholder image that transitions quite seamlessly into the final image — even more so using a crossfade animation.
Further reduction in payload size was achieved by extracting a common JPEG header and prepending it on the client before decoding. As this is beyond the scope of this article, please refer to the Facebook post for more details.
Implementing blurred thumbnails using Glide
At Zedge, images are a vital part of our applications, and the Wallpapers & Ringtones app loads a lot of them. It has always been a high priority to optimize for the user experience and to reduce the perceived load times — especially for images.
Inspired by the post from Facebook we implemented a custom loader for the Glide image library in order to simplify and streamline the process of loading images, as well as benefiting from Glide’s disk and memory caches. Fortunately for us, Glide supports the concept of loading a “thumbnail”, which is an image that is displayed until the main image is loaded. The thumbnail can be loaded from the network or from a custom model — perfect for our use case. This is different from a Glide “placeholder”, which can only be a locally available resource.
The network response
Our applicable network response includes metadata such as an image URL and its respective Base64-encoded tiny JPEG. This is the tiny variant that will be decoded and rendered on device while loading the full-size image. The client stores this Base64 string in a data class, TinyThumb, which for the purposes of this article can be simplified to:

```kotlin
data class TinyThumb(val base64: String)
```
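As a plain-JVM sketch of what the client does with this payload (using the JDK’s `java.util.Base64` here; on Android you would typically use `android.util.Base64`), turning the string back into raw JPEG bytes is a one-liner:

```kotlin
import java.util.Base64

data class TinyThumb(val base64: String)

// Decode the Base64 payload back into raw JPEG bytes, ready for bitmap decoding.
// (On Android you would typically use android.util.Base64 instead of the JDK decoder.)
fun TinyThumb.toJpegBytes(): ByteArray = Base64.getDecoder().decode(base64)
```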
In order to load instances of TinyThumb using Glide we have to provide three key components:

- ModelLoader: a factory interface for translating an arbitrarily complex data model into a concrete data type that a DataFetcher can use to obtain the data for the resource represented by the model. In our case the model is TinyThumb, but it could e.g. be a custom model of your own.
- DataFetcher: lazily retrieves data that can be used to load a resource. This would normally download something over the network, but we already have the data encoded as Base64, so this is mostly boilerplate code.
- ResourceDecoder: an interface for decoding resources from one type to another, in our case from TinyThumb to BitmapDrawable. This is where the actual work happens.
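A minimal sketch of the first two components might look like the following (class names such as `TinyThumbLoader` and `TinyThumbFetcher` are mine, and error handling is elided). Since the model already carries all the data, the loader and fetcher simply pass it through:

```kotlin
import com.bumptech.glide.Priority
import com.bumptech.glide.load.DataSource
import com.bumptech.glide.load.Options
import com.bumptech.glide.load.data.DataFetcher
import com.bumptech.glide.load.model.ModelLoader
import com.bumptech.glide.load.model.ModelLoaderFactory
import com.bumptech.glide.load.model.MultiModelLoaderFactory
import com.bumptech.glide.signature.ObjectKey

// Pass-through loader: the TinyThumb model already contains the data we need.
class TinyThumbLoader : ModelLoader<TinyThumb, TinyThumb> {
    override fun buildLoadData(model: TinyThumb, width: Int, height: Int, options: Options) =
        ModelLoader.LoadData(ObjectKey(model.base64), TinyThumbFetcher(model))

    override fun handles(model: TinyThumb) = true

    class Factory : ModelLoaderFactory<TinyThumb, TinyThumb> {
        override fun build(multiFactory: MultiModelLoaderFactory) = TinyThumbLoader()
        override fun teardown() = Unit
    }
}

// Boilerplate fetcher: no network round trip, the data is already in memory.
class TinyThumbFetcher(private val model: TinyThumb) : DataFetcher<TinyThumb> {
    override fun loadData(priority: Priority, callback: DataFetcher.DataCallback<in TinyThumb>) =
        callback.onDataReady(model)

    override fun cleanup() = Unit
    override fun cancel() = Unit
    override fun getDataClass() = TinyThumb::class.java
    override fun getDataSource() = DataSource.LOCAL
}
```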
In the interest of completeness, but also reducing complexity in this article, we’ll simplify the classes a bit. At the end there will be references to more extensive versions.
TinyThumbDecoder does the heavy lifting. Focus on the decode function, as this is what actually returns the BitmapDrawable resource to Glide.
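A simplified sketch of the decoder could look like this (the blur step and sizing edge cases are elided here; it assumes Glide requests concrete dimensions). It decodes the Base64 payload into a tiny bitmap, upscales it, and wraps the result for Glide:

```kotlin
import android.content.res.Resources
import android.graphics.Bitmap
import android.graphics.BitmapFactory
import android.graphics.drawable.BitmapDrawable
import android.util.Base64
import com.bumptech.glide.load.Options
import com.bumptech.glide.load.ResourceDecoder
import com.bumptech.glide.load.engine.Resource
import com.bumptech.glide.load.resource.drawable.DrawableResource

class TinyThumbDecoder(private val resources: Resources) :
    ResourceDecoder<TinyThumb, BitmapDrawable> {

    override fun handles(source: TinyThumb, options: Options) = true

    override fun decode(
        source: TinyThumb, width: Int, height: Int, options: Options
    ): Resource<BitmapDrawable>? {
        // Decode the Base64 payload into the tiny bitmap...
        val bytes = Base64.decode(source.base64, Base64.DEFAULT)
        val tiny = BitmapFactory.decodeByteArray(bytes, 0, bytes.size) ?: return null
        // ...then upscale to the requested size. The blur step is omitted in this sketch.
        val scaled = Bitmap.createScaledBitmap(tiny, width, height, true)
        val drawable = BitmapDrawable(resources, scaled)
        return object : DrawableResource<BitmapDrawable>(drawable) {
            override fun getResourceClass() = BitmapDrawable::class.java
            override fun getSize() = scaled.byteCount
            override fun recycle() = Unit
        }
    }
}
```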
Registering our custom model
The last step is to register the classes with the Glide component registry as follows:
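Sketched with the hypothetical class names from above, registration in a Glide module could look like:

```kotlin
import android.content.Context
import android.graphics.drawable.BitmapDrawable
import com.bumptech.glide.Glide
import com.bumptech.glide.Registry
import com.bumptech.glide.annotation.GlideModule
import com.bumptech.glide.module.AppGlideModule

@GlideModule
class MyAppGlideModule : AppGlideModule() {
    override fun registerComponents(context: Context, glide: Glide, registry: Registry) {
        registry
            // Teach Glide how to "load" a TinyThumb model (pass-through)...
            .append(TinyThumb::class.java, TinyThumb::class.java, TinyThumbLoader.Factory())
            // ...and how to decode it into a BitmapDrawable.
            .append(TinyThumb::class.java, BitmapDrawable::class.java,
                TinyThumbDecoder(context.resources))
    }
}
```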
And that’s it! 🎉
This enables us to load images using a pattern similar to:
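For example, using Glide’s `thumbnail(RequestBuilder)` support, a request might be assembled roughly like this (the `item` fields are hypothetical):

```kotlin
Glide.with(imageView)
    .load(item.imageUrl)
    .thumbnail(
        Glide.with(imageView)
            .load(TinyThumb(item.thumbBase64))  // shown while the full image loads
    )
    .transition(DrawableTransitionOptions.withCrossFade())
    .into(imageView)
```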
Caveats on performance
It’s important to note that many of the magic numbers are application/use-case specific. Please profile both your application and backend to make sure the size of the tiny thumb is ideal: large enough to function, but small enough to ensure reasonable load times. The last thing you want is to increase total load times. Also keep in mind that bitmaps can take up a lot of memory.
The blurring operation can also be taxing and you should tweak both the blur radius and sample size in order to get acceptable results.
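As a rough illustration of why sample size matters: a decoded ARGB_8888 bitmap occupies width × height × 4 bytes in memory, and decoding with a sample size divides both dimensions:

```kotlin
// Approximate in-memory size of a decoded ARGB_8888 bitmap (4 bytes per pixel).
// sampleSize mirrors BitmapFactory.Options.inSampleSize: each dimension is divided by it.
fun bitmapBytes(width: Int, height: Int, sampleSize: Int = 1): Long {
    val w = width / sampleSize
    val h = height / sampleSize
    return w.toLong() * h * 4
}

fun main() {
    println(bitmapBytes(1080, 1920))     // full size: 8294400 bytes (~8.3 MB)
    println(bitmapBytes(1080, 1920, 4))  // sampled at 1/4: 518400 bytes (~0.5 MB)
}
```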
Using AirBrush for blurred thumbnails
Since this is a common use case at Zedge I’ve created a library named AirBrush. It’s essentially a small integration library for Glide which we’ve been running in production for quite some time. It is available on Maven Central. Contributions are welcome! 👏🏻
By adding the library dependency, TinyThumb support is automagically added to Glide. There is also a sample app that showcases the effect.
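In a Gradle Kotlin DSL build file this amounts to a single dependency line (the coordinate below is illustrative; check Maven Central for the current group, artifact, and version):

```kotlin
// build.gradle.kts — coordinate is illustrative, verify on Maven Central
dependencies {
    implementation("net.zedge:airbrush:<latest-version>")
}
```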
For more details see AirBrush’s GitLab page.
Thank you for reading 🙏
¹ Florian Marcu. (May 3rd, 2021). How to Improve Perceived Performance in Mobile Apps. instamobile.
² Edward Kandrot. (August 1st, 2015). The technology behind preview photos. https://engineering.fb.com/2015/08/06/android/the-technology-behind-preview-photos/