Colour your apps in React Native using Material Palette
Successful apps invest in good visual design; it is essential for a pleasant user experience, one where our eyes are delighted by every interaction. Material Design marked a big step forward in that regard, providing a visual language that synthesises the classic principles of good design with the innovation and possibility of technology and science. Material as a metaphor, bold and intentional graphics, and motion as a way of providing meaning are some of the principles it established for laying out beautiful experiences.
The Palette API, introduced alongside Android 5.0 Lollipop as part of the Android Support Library, allows you to extract prominent colors from images to help you create visually engaging apps. Mobile apps like WhatsApp leverage it (see the contact details page), and whether or not you are building an application that follows the Material Design spec, it's a really powerful feature you can make use of.
Today, I’m excited to announce react-native-material-palette, a library which wraps the Palette API in a nice JS API to use in React Native.
Which colors can I get?
According to the specification, the Palette API defines six color profiles for a given image by default:
- Light Vibrant
- Vibrant
- Dark Vibrant
- Light Muted
- Muted
- Dark Muted
If those names sound like hieroglyphs to you, as they did to me the first time I saw them, let's get a better insight by understanding how they are calculated.
The profiles are determined by analyzing the HSL (Hue, Saturation, Lightness) values of the pixels in the image, based on predefined target ranges for luminance, saturation, and population (how many pixels in the image are represented by that particular profile). It uses a weighted average calculation that gives preference to luminance, followed by saturation and finally population.
Generally speaking, vibrant colors are more saturated than muted colors and the light/dark variations of the profiles operate on luminosity.
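The matching idea can be sketched in plain JavaScript. The target values and weights below are approximations of the Android defaults (the real ones live in the support library's Target class), so treat this as an illustration of the technique rather than the library's actual implementation:

```javascript
// Approximate saturation/lightness targets for the six default profiles.
// Illustrative values only; the authoritative ones are in Android's Target class.
const TARGETS = {
  lightVibrant: { saturation: 1.0, lightness: 0.74 },
  vibrant:      { saturation: 1.0, lightness: 0.5 },
  darkVibrant:  { saturation: 1.0, lightness: 0.26 },
  lightMuted:   { saturation: 0.3, lightness: 0.74 },
  muted:        { saturation: 0.3, lightness: 0.5 },
  darkMuted:    { saturation: 0.3, lightness: 0.26 },
};

// Weighted score: lightness matters most, then saturation, then how many
// pixels share the color (population). Higher is a better match.
function score([, s, l], population, maxPopulation, target) {
  const WEIGHT_SATURATION = 0.24;
  const WEIGHT_LIGHTNESS = 0.52;
  const WEIGHT_POPULATION = 0.24;
  return (
    WEIGHT_SATURATION * (1 - Math.abs(s - target.saturation)) +
    WEIGHT_LIGHTNESS * (1 - Math.abs(l - target.lightness)) +
    WEIGHT_POPULATION * (population / maxPopulation)
  );
}

// Pick the best-matching profile for an HSL color.
function classify(hsl, population, maxPopulation) {
  let best = null;
  for (const [name, target] of Object.entries(TARGETS)) {
    const s = score(hsl, population, maxPopulation, target);
    if (!best || s > best.score) best = { name, score: s };
  }
  return best.name;
}

console.log(classify([210, 0.9, 0.5], 800, 1000)); // a saturated mid-lightness blue → "vibrant"
```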
Enough technicalities, let’s move to the library itself.
The React Native Module
Let me highlight some of the features the library offers in terms of implementation details.
Kotlin
Since Google announced Kotlin as a first-class language for developing Android apps at their I/O keynote, we wanted to give it a try for implementing the native part. Kotlin is certainly an awesome language to work with: it is very easy to set up, and the runtime library adds less than 1MB. Also, as of RN 0.47, react-native link works for modules developed in Kotlin, so you don't have to worry about manually setting up project dependencies.
100% test coverage
We built this library with robustness in mind, but also with future external contributions. The JS logic is fully tested with Jest, so not only do we make sure the functionality works as expected, but external contributions can also be sped up by simply extending the test suite we put in place.
Caching
We use Fresco on the native side to automatically cache images downloaded from the internet, so image loading stays performant in your applications.
Use cases
Our API essentially supports two use cases: image galleries, and styling a screen's color scheme based on a particular image. A picture paints a thousand words, so here is a visual cue of what you can achieve with the library:
Image gallery
Styling screen color scheme based on an image
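The second use case can be sketched as follows. The exact function and field names (`createMaterialPalette`, `bodyTextColor`, `titleTextColor`, and so on) are assumptions here, so check the repository README for the real API; the palette creation is stubbed with a fixed result so the styling logic is self-contained:

```javascript
// Stub standing in for the library's native palette extraction, returning a
// hypothetical palette shape. In the real library this comes from native code.
function createMaterialPalette(image) {
  return Promise.resolve({
    vibrant:   { color: '#1E88E5', titleTextColor: '#FFFFFF', population: 1200 },
    darkMuted: { color: '#263238', titleTextColor: '#FFFFFF', population: 300 },
  });
}

// Style a screen header from an image: use the vibrant swatch for the
// background and its suggested text color for the title.
async function headerStyleFor(image) {
  const palette = await createMaterialPalette(image);
  const swatch = palette.vibrant || palette.darkMuted; // fall back if a profile is missing
  return { backgroundColor: swatch.color, color: swatch.titleTextColor };
}

headerStyleFor('https://example.com/cover.jpg').then((style) => {
  console.log(style); // background '#1E88E5', text '#FFFFFF'
});
```

A fallback swatch is worth having in practice, since a given image may not yield every profile.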
Future work
There are still some limitations that we plan to overcome in the near future:
iOS support
At the moment the library only supports Android. However, it's feasible to extend the functionality to iOS as well, since the algorithm is open source and there have already been successful attempts to port it to other environments, such as the web. If you would like to help out with that, PRs are more than welcome!
Customising your own profiles
The API provides six color targets by default. Sometimes a given image has no colors fulfilling the criteria of the predefined profiles. It would be useful to define custom targets in addition to the existing ones, with different weightings and target lightness and saturation values, in order to increase the chance of finding a useful color. Supporting that is on our roadmap as well.
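To make the idea concrete, a custom target could conceivably be described as plain data: a desired saturation/lightness point plus weights deciding which dimension matters most when scoring candidate swatches. This is a sketch of the concept, not the library's actual (or planned) API:

```javascript
// Hypothetical custom target: prefer lightly saturated, very light colors.
const mutedPastel = {
  saturation: { target: 0.35, weight: 0.5 },
  lightness:  { target: 0.85, weight: 0.4 },
  population: { weight: 0.1 },
};

// Score a candidate swatch against a custom target (higher is better).
function scoreAgainst(target, { s, l, population }, maxPopulation) {
  return (
    target.saturation.weight * (1 - Math.abs(s - target.saturation.target)) +
    target.lightness.weight * (1 - Math.abs(l - target.lightness.target)) +
    target.population.weight * (population / maxPopulation)
  );
}

// The best swatch for a custom target is simply the one with the top score.
function bestSwatch(target, swatches, maxPopulation) {
  return swatches.reduce((best, s) =>
    scoreAgainst(target, s, maxPopulation) > scoreAgainst(target, best, maxPopulation) ? s : best
  );
}

const swatches = [
  { color: '#FAD2E1', s: 0.4, l: 0.9, population: 400 },
  { color: '#2B2D42', s: 0.2, l: 0.2, population: 900 },
];
console.log(bestSwatch(mutedPastel, swatches, 900).color); // '#FAD2E1'
```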
Wanna try it out?
You can find the whole API in the GitHub repository. We also have an example app available so that you can play with it immediately :)
What are you waiting for? Take the brush and enjoy!