Android meets Machine Learning (part 1): From TensorFlow mobile/lite to ML Kit

Qian
Publicis Sapient France
5 min read · May 23, 2018

TensorFlow and Magritte project

About a year and a half ago, I attended a talk by Martin Görner: Tensorflow and deep learning — without a PhD. From that point on, the whole machine learning/deep learning buzz seemed less obscure and more fun to read about. Together with Yoann and Sylvain, my co-workers at Xebia France, I built a little side project, Magritte, right after the TensorFlow team released their very first Android demo.

It’s a simple app that helps you learn languages by recognising objects of daily life, and it works offline:

(Magritte screens)

We went on to talk about it at several conferences throughout 2017:

The interesting, and also difficult, part of building something with a constantly evolving technology such as TensorFlow is that you have to keep updating your content with all the breaking changes.

This is the demo we built right before AndroidMakers 2017 with the TensorFlow Mobile SDK, running inference on a Nexus 5X with a model retrained from Inception-v3 and then quantized. You can clearly see the latency, which is around 3–4 seconds:

(Magritte demo for AndroidMakers 2017)

Just one month later, I was sitting in the front row of a Google I/O talk where Kazunori Sato and Hak Matsuda told some awesome stories about building Android apps with the TensorFlow SDK. That was my best I/O moment, along with the Kotlin support and the TensorFlow Lite announcement ;)

In June 2017, MobileNet was released. We (mostly Yoann) were jumping up and down, psyched about the performance improvements. Here is the same application running with a model retrained from the quantized MobileNet_v1_1.0_224 model; besides the obvious change in model file size, inference runs significantly faster:

(Magritte demo for DevFest Nantes 2017)

We compared the model file sizes:

  • Model retrained from Inception-v3: ~80 MB
  • Model retrained from Inception-v3, quantized: ~20 MB
  • Model retrained from MobileNet_v1_1.0_224, quantized: ~5 MB
  • Model retrained from MobileNet_v1_0.25_224, quantized: ~1 MB
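These sizes line up with simple arithmetic: a float32 weight takes 4 bytes and an 8-bit quantized weight takes 1, so quantization alone gives roughly a 4× reduction, and MobileNet's smaller width multiplier shrinks the parameter count further. A rough back-of-envelope sketch (the parameter counts below are approximate published figures for each architecture, not values measured from our files, and real model files also carry graph metadata):

```python
# Rough model-size estimate: parameter count x bytes per weight.
# Parameter counts are approximate published figures, used here
# only to illustrate why quantization and smaller architectures
# shrink the file so dramatically.
PARAM_COUNTS = {
    "inception_v3": 23_800_000,
    "mobilenet_v1_1.0_224": 4_200_000,
    "mobilenet_v1_0.25_224": 470_000,
}

def estimated_size_mb(params, bytes_per_weight):
    """Estimate on-disk weight size in megabytes."""
    return params * bytes_per_weight / 1_000_000

# float32 weights (4 bytes each) vs 8-bit quantized weights (1 byte each)
for name, params in PARAM_COUNTS.items():
    f32 = estimated_size_mb(params, 4)
    q8 = estimated_size_mb(params, 1)
    print(f"{name}: ~{f32:.0f} MB float32, ~{q8:.0f} MB quantized")
```

The estimates are in the same ballpark as the file sizes we observed, which is exactly the point: most of a vision model's bulk is its weights.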

We also benchmarked the changes in CPU and memory consumption when running the application with models retrained in these different configurations:

(Model performance benchmarks)
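For reference, per-inference latency of the kind quoted above can be measured with a simple timing loop. The sketch below is not our actual harness; `run_inference` is a hypothetical stand-in for the real TensorFlow call, and the warmup runs exist because one-time graph setup would otherwise skew the average:

```python
import time

def benchmark(run_inference, image, warmup=3, runs=20):
    """Return average wall-clock latency of an inference callable, in ms.

    run_inference is a stand-in for the real model call; a few
    warmup invocations are discarded so one-time initialization
    does not distort the measurement.
    """
    for _ in range(warmup):
        run_inference(image)
    start = time.monotonic()
    for _ in range(runs):
        run_inference(image)
    return (time.monotonic() - start) / runs * 1000

# Usage with a dummy callable standing in for the real network:
fake_model = lambda image: sum(image)  # placeholder "inference"
latency_ms = benchmark(fake_model, [0.1] * 1000)
print(f"average latency: {latency_ms:.3f} ms")
```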

The future belongs to TensorFlow Lite and ML Kit

Right after I presented these benchmarks at DroidCon London, TensorFlow Lite was finally released in mid-November, following the initial announcement at Google I/O 2017. At that point we started converting our model to the TensorFlow Lite format, and we kept being amazed by how much the performance improved.

During this year's Google I/O, you all witnessed Google's efforts to make integrating machine learning into mobile apps even easier. ML Kit feels like the ultimate answer to my obsession with building mobile applications with on-device intelligence, and it comes with Firebase. Are you kidding me? That's sort of a dream come true.

For the Magritte project, we built our own mini backend API to serve models and support model updates. Now it's ALL simply available with Firebase!

But there is still a long way to go: toco (the TensorFlow Lite Optimizing Converter) is currently half broken. Here are some of the error messages we recently got from toco:

And the ML Kit example provided by Google doesn't feel as fancy as what was shown in the I/O demo. But developers are definitely excited by the ML Kit announcement, and I'm looking forward to catching up with blog posts and demos from the awesome Android community ❤

A little tribute

Why am I obsessed with on-device machine learning? From 2013 to 2014 I worked for a French startup called Moodstocks, which was later acquired by Google and became the core of the team behind Google's mobile vision API. I started my career as an Android developer and learned so much at Moodstocks. I was, and still am, a firm believer in their initial vision: give eyes to your mobile apps and bridge the virtual and physical worlds.

(Moodstocks website in 2013)

Want to get started?

You can find everything about the Magritte project on GitHub, including the updated Python scripts that help you convert your protobuf model file to a tflite model file.

If you want to get started training your own model and building an Android application with TensorFlow Mobile / Lite, the codelab series "TensorFlow for Poets" is definitely the best tutorial to begin with:

In the meantime, I'm working on some ML Kit demonstrations and, hopefully soon, part 2 of Android meets Machine Learning. Ping me on Twitter if you want to have a discussion! Feedback is more than welcome :)

We are also organising a meetup at Xebia on a related subject: Mobile Things meetup — when machine learning meets augmented reality. Grab your seat if you are in Paris on June 20th!

You can find other articles in French on various subjects on Xebia France's blog. Have a look :)

Find all technical articles by Xebia on blog.xebia.fr


Android developer, machine learning newbie, GDE IoT. Enthusiast on accessibility TechForGood subjects | currently @XebiaFr