Solve a Word Search Game using Firebase ML Kit and Huawei ML Kit

Allow your users the freedom to choose their Android platform while providing the same features

Giovanni Laquidara
Huawei Developers
4 min read · May 28, 2020


A Classic Word Search game

Some time ago I developed an Android application that solves Word Search games, using the services of Firebase ML Kit.

It was an interesting trip discovering the features of a framework that allows the developer to use AI capabilities without knowing all the rocket science behind them.

Specifically, I used the document recognition feature to extract the text from a word search game image.

After the text recognition phase, the output was cleaned and arranged into a matrix to be processed by the solver algorithm. The algorithm looked for all the words formed by grouping letters according to the rules of the game: contiguous letters along all the straight directions (vertical, horizontal, and diagonal), as in the sketch below.
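The complete solver lives in the app’s source; the core idea can be sketched in a few lines of Kotlin (the grid and dictionary shapes here are assumptions for illustration):

```kotlin
// Core solver idea: from every cell, walk each of the 8 straight directions,
// growing a string and recording it whenever it matches a dictionary word.
fun solve(grid: List<List<Char>>, dictionary: Set<String>): Set<String> {
    val directions = listOf(
        0 to 1, 0 to -1, 1 to 0, -1 to 0,    // horizontal and vertical
        1 to 1, 1 to -1, -1 to 1, -1 to -1   // diagonals
    )
    val found = mutableSetOf<String>()
    for (row in grid.indices) for (col in grid[row].indices) {
        for ((dr, dc) in directions) {
            val word = StringBuilder()
            var r = row
            var c = col
            while (r in grid.indices && c in grid[r].indices) {
                word.append(grid[r][c])
                if (word.toString() in dictionary) found += word.toString()
                r += dr
                c += dc
            }
        }
    }
    return found
}
```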

This app ran well on all the Android devices capable of running the Google Firebase SDK and the Google Mobile Services (GMS).

Since the second half of 2019, new Huawei devices can no longer run GMS due to government restrictions; you can read more about this here:

My app could not run on the brand-new Huawei devices :(

So I looked for a solution to make this case-study app run on the new Huawei devices.

Let’s follow my journey…

The Discovery of HMS ML Kit

I went through the Huawei documentation on

Here you can find many SDKs, AKA Kits, offering a set of smart features to developers.

I found one offering the features I was looking for: HMS ML Kit. It is quite similar to the Firebase one, as it allows the developer to use machine learning capabilities such as image, text, and face recognition.

In particular, for my specific use case, I used the text analyzer, which can run locally on the device and take advantage of neural processing on the NPU hardware.

Integrating HMS ML Kit was super easy. If you want to give it a try, it’s just a matter of adding a dependency to your build.gradle file, enabling the service from the AppGallery web dashboard (if you want to use the Cloud API), downloading the agconnect-services.json configuration file, and using it in your app.
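For reference, the dependency looks roughly like this (shown in Gradle Kotlin DSL; the artifact coordinates and version are from the 2020 SDK docs and may have changed, so treat them as placeholders and check the guide):

```kotlin
// app/build.gradle(.kts) — artifact and version are assumptions; verify
// against the current HMS ML Kit integration guide.
dependencies {
    implementation("com.huawei.hms:ml-computer-vision-ocr:1.0.3.300")
}
```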

You can refer to the official guide here for the needed steps:

Architectural Approach

My first goal was to maintain and deploy a single APK, so I wanted to integrate both the Firebase ML Kit SDK and the HMS ML Kit one.

I thought about the main feature:

Decode the image and get back the detected text, together with the bounding boxes surrounding each character, to better display the spotted text to the user.

This was defined by this interface:
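A minimal sketch of such a contract (the names and the callback style are assumptions, not the exact code from the repo):

```kotlin
import android.graphics.Bitmap

// Common contract implemented by both the GMS- and HMS-backed recognizers:
// take a bitmap and deliver the recognized Document or an error via callbacks.
interface DocumentTextRecognizer {
    fun processImage(
        bitmap: Bitmap,
        success: (Document) -> Unit,
        error: (Exception) -> Unit
    )
}
```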

I also defined my own data classes to have a common output format for both services:
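Again as a sketch, with field names assumed for illustration:

```kotlin
import android.graphics.Rect

// Common output model shared by both services (illustrative names).
data class Symbol(
    val char: Char,  // the recognized character
    val rect: Rect,  // bounding box of the string this character belongs to
    val idx: Int     // index of the character inside that string
)

data class Document(
    val text: String,          // full recognized text
    val symbols: List<Symbol>  // one entry per recognized character
)
```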

Here Document represents the text result returned by the ML Kit services. It contains a list of Symbol objects (the recognized characters), each with its own char, the bounding box surrounding it (Rect), and its index in the detected string, since both ML Kit services group several characters into a string with a single bounding box.

Then I created an object capable of instantiating the right recognizer depending on which service (HMS or GMS) is running on the device:
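A sketch of that selection logic; HuaweiApiAvailability and GoogleApiAvailability are the real availability helpers from each SDK, while the recognizer class names are illustrative:

```kotlin
import android.content.Context
import com.google.android.gms.common.GoogleApiAvailability
import com.google.android.gms.common.ConnectionResult as GmsResult
import com.huawei.hms.api.HuaweiApiAvailability
import com.huawei.hms.api.ConnectionResult as HmsResult

// Picks the recognizer backed by whichever mobile service the device runs.
object DocumentTextRecognizerService {
    fun create(context: Context): DocumentTextRecognizer = when {
        HuaweiApiAvailability.getInstance()
            .isHuaweiMobileServicesAvailable(context) == HmsResult.SUCCESS ->
            HuaweiDocumentTextRecognizer()
        GoogleApiAvailability.getInstance()
            .isGooglePlayServicesAvailable(context) == GmsResult.SUCCESS ->
            GoogleDocumentTextRecognizer()
        else -> error("Neither HMS nor GMS is available on this device")
    }
}
```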

This was pretty much all it took to make it work.

The ViewModel can then use the service provided by the right recognizer, created when the WordSearchAiViewModel itself is instantiated, as in the sketch below.
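A sketch of that wiring; findWords stands in for the solver step and is hypothetical here:

```kotlin
import android.graphics.Bitmap
import androidx.lifecycle.MutableLiveData
import androidx.lifecycle.ViewModel

// Sketch: the ViewModel receives the recognizer picked by the factory and
// feeds the recognized Document into the solver.
class WordSearchAiViewModel(
    private val recognizer: DocumentTextRecognizer
) : ViewModel() {

    val wordsFound = MutableLiveData<List<String>>()

    fun detectDocumentTextIn(bitmap: Bitmap) {
        recognizer.processImage(
            bitmap,
            success = { document -> wordsFound.postValue(findWords(document)) },
            error = { it.printStackTrace() }
        )
    }

    // Placeholder for the solver step (hypothetical here).
    private fun findWords(document: Document): List<String> = emptyList()
}
```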

Running the app and choosing a word search game image on a Mate 30 Pro (an HMS device) shows this result:

Test Word Search game app

The Recognizer Brothers

You can check the code of the two recognizers below. What they do is use each vendor’s SDK implementation to get the result and adapt it to the common interface; you could virtually plug in any other service capable of doing the same.
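The full implementations live in the repository; here is a simplified sketch of the HMS-side recognizer (MLAnalyzerFactory, MLFrame, and asyncAnalyseFrame are the actual HMS ML Kit API, while the adapter is abbreviated). The Firebase recognizer mirrors this structure on top of FirebaseVision:

```kotlin
import android.graphics.Bitmap
import com.huawei.hms.mlsdk.MLAnalyzerFactory
import com.huawei.hms.mlsdk.common.MLFrame
import com.huawei.hms.mlsdk.text.MLText

// HMS-backed implementation: runs the on-device text analyzer (NPU-accelerated
// where available) and adapts the MLText result to the common Document model.
class HuaweiDocumentTextRecognizer : DocumentTextRecognizer {

    private val analyzer = MLAnalyzerFactory.getInstance().localTextAnalyzer

    override fun processImage(
        bitmap: Bitmap,
        success: (Document) -> Unit,
        error: (Exception) -> Unit
    ) {
        val frame = MLFrame.fromBitmap(bitmap)
        analyzer.asyncAnalyseFrame(frame)
            .addOnSuccessListener { mlText -> success(adaptToDocument(mlText)) }
            .addOnFailureListener { e -> error(e) }
    }

    // Simplified adapter: the real code walks blocks/lines/words to collect the
    // per-character Symbols with bounding boxes; here only the raw string is kept.
    private fun adaptToDocument(mlText: MLText): Document =
        Document(text = mlText.stringValue, symbols = emptyList())
}
```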

Conclusion

As good Android developers, we should develop and deploy our apps on all the platforms our users can reach, love, and adopt, without excluding anyone.

We should spend some time trying to give all users the same experience. This is a small sample of that, and more will come in the future.

If you are interested in the algorithm or the architecture of the app, you can read all the source code of the complete app on

This article is also published on

You can ask your questions here and in the forum. Happy to answer :)
