ML Kit — Language Detection and Translation

Ahmet Yunus Sevim
Published in Huawei Developers
Jul 8, 2020

Hello everyone. In this article we will learn about the HMS ML Kit Language Detection and Text Translation services, and how to enable and integrate them.

About ML Kit

HMS ML Kit provides a wide range of machine learning services and allows you to easily integrate these services.

Figure 1, ML Kit services.

Both Language Detection and Text Translation can work on-device or on-cloud. Language Detection works the same way in both modes and can detect up to 49 languages of text. Text Translation supports only Chinese and English on-device; on-cloud, it supports 12 languages (Simplified Chinese, English, French, Arabic, Thai, Spanish, Turkish, Portuguese, Japanese, German, Italian, and Russian).

In this demo project, the language detector detects the language of the input text (e.g. “tr”). The translator then takes the detector’s output as a parameter and translates from that language to English. Although Text Translation supports 12 languages, this example only translates to English.

Enabling the Services

Figure 2, Data storage location.

First of all, the data storage location must be configured for ML services. Go to Project Settings > General information and click Set next to Data storage location under Project.

Then enable ML Kit from Project Settings > Manage APIs > ML Kit. After that, go to Develop > Build > ML Kit, where you can view information about service usage.

Figure 3, Agconnect-services.json file.

Note: Don’t forget to download the agconnect-services.json file and add it to the app folder of your project.

Adding Build Dependencies

ML Kit provides two SDK integration modes: full SDK and base SDK.

The base SDK supports only Huawei phones running EMUI 5.0 or later when the services are used on-device. If only on-cloud capabilities are used, the base SDK supports all phones.

Use the full SDK if on-device capabilities are needed; it supports all devices running Android 4.4 or later.

Add the following dependencies to the gradle file in the app folder for the base SDK:
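The original dependency snippet is not shown here, so the following is a sketch of the base-SDK dependencies based on the HMS ML Kit documentation; the version numbers are assumptions and should be checked against the current release notes:

```groovy
dependencies {
    // Language Detection base SDK (models are provided through HMS Core / cloud)
    implementation 'com.huawei.hms:ml-computer-language-detection:2.0.1.300'
    // Text Translation base SDK
    implementation 'com.huawei.hms:ml-computer-translate:2.0.1.300'
}
```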

Add the following dependencies to the gradle file in the app folder for the full SDK:
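As a sketch (again with assumed version numbers), the full SDK adds the on-device model packages on top of the base capability artifacts:

```groovy
dependencies {
    // Base capabilities
    implementation 'com.huawei.hms:ml-computer-language-detection:2.0.1.300'
    implementation 'com.huawei.hms:ml-computer-translate:2.0.1.300'
    // Bundled on-device model packages (full SDK only)
    implementation 'com.huawei.hms:ml-computer-language-detection-model:2.0.1.300'
    implementation 'com.huawei.hms:ml-computer-translate-model:2.0.1.300'
}
```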

The following table shows the required user permissions (AndroidManifest.xml file):

Figure 4, User Permissions for Text Translation service and Language Detection service

Modify your AndroidManifest.xml and add permission list.
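The permission list itself is shown in Figure 4; as a sketch, the network-related permissions typically required for the on-cloud capabilities look like this (verify the exact set against the figure):

```xml
<!-- Network access is required for the on-cloud detection and translation calls. -->
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
```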

Development

Create the XML layout file.

Figure 5, Simple UI Design
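The original layout file is not included here; a minimal sketch matching the simple UI in Figure 5 might look like the following (all view IDs are hypothetical):

```xml
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical"
    android:padding="16dp">

    <!-- Text to detect and translate -->
    <EditText
        android:id="@+id/sourceText"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:hint="Enter text" />

    <Button
        android:id="@+id/translateButton"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Translate" />

    <!-- Translated result -->
    <TextView
        android:id="@+id/resultText"
        android:layout_width="match_parent"
        android:layout_height="wrap_content" />
</LinearLayout>
```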

Language Detection

We will use the Language Detection service on-device. Only a few lines of code differ between on-device and on-cloud usage. First, we need to create MLLangDetectorFactory and MLLocalLangDetectorSetting objects, then create an MLLocalLangDetector using the setting object.
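The creation step can be sketched as follows, based on the HMS ML Kit Language Detection API; the trusted-threshold value is an assumption:

```java
MLLangDetectorFactory factory = MLLangDetectorFactory.getInstance();

// Minimum confidence for a candidate language to be returned (0.01 as an example).
MLLocalLangDetectorSetting setting = new MLLocalLangDetectorSetting.Factory()
        .setTrustedThreshold(0.01f)
        .create();

MLLocalLangDetector myLocalLangDetector = factory.getLocalLangDetector(setting);
```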

Now we can create a function for language detection. This code returns the code of the language with the highest confidence. The function takes the text to detect as a parameter and finds its language code (e.g. “tr”, “en”).
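A sketch of such a function, using the detector’s firstBestDetect API (the translate call it hands off to is a hypothetical helper):

```java
private void detectLanguage(final String sourceText) {
    // firstBestDetect returns only the language code with the highest confidence.
    Task<String> firstBestDetectTask = myLocalLangDetector.firstBestDetect(sourceText);
    firstBestDetectTask.addOnSuccessListener(new OnSuccessListener<String>() {
        @Override
        public void onSuccess(String languageCode) {
            // e.g. "tr" for Turkish; pass the code on to the translator.
            translate(sourceText, languageCode);
        }
    }).addOnFailureListener(new OnFailureListener() {
        @Override
        public void onFailure(Exception e) {
            Log.e(TAG, "Language detection failed", e);
        }
    });
}
```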

If you want to return multiple detection results with their language codes and confidences, change the code as follows:
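That is, instead of firstBestDetect, the probabilityDetect API returns every candidate above the trusted threshold (a sketch):

```java
// Returns all candidate languages whose confidence exceeds the trusted threshold.
Task<List<MLDetectedLang>> probabilityDetectTask =
        myLocalLangDetector.probabilityDetect(sourceText);
probabilityDetectTask.addOnSuccessListener(new OnSuccessListener<List<MLDetectedLang>>() {
    @Override
    public void onSuccess(List<MLDetectedLang> detectedLangs) {
        for (MLDetectedLang lang : detectedLangs) {
            Log.d(TAG, lang.getLangCode() + " : " + lang.getProbability());
        }
    }
}).addOnFailureListener(e -> Log.e(TAG, "Language detection failed", e));
```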

If you want to use language detection on-cloud instead, change the code as follows:
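Only the factory/setting classes change for the on-cloud variant; a sketch:

```java
// On-cloud variant: remote setting and detector classes replace the local ones.
MLRemoteLangDetectorSetting remoteSetting = new MLRemoteLangDetectorSetting.Factory()
        .setTrustedThreshold(0.01f)
        .create();
MLRemoteLangDetector myRemoteLangDetector =
        MLLangDetectorFactory.getInstance().getRemoteLangDetector(remoteSetting);
// firstBestDetect and probabilityDetect are then used the same way as on-device.
```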

Release resources after the detection is complete:
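For example, in an Activity this can be done in onDestroy (a sketch):

```java
@Override
protected void onDestroy() {
    super.onDestroy();
    // Stop the detector to release its resources.
    if (myLocalLangDetector != null) {
        myLocalLangDetector.stop();
    }
}
```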

Text Translation

In this part we will add the text translation function. In this example it will only translate to English. If you want, you can take the desired target language from the user and pass it to the translate method as a parameter, as shown in the following code (marked with a comment line).

In this method, sourceText is the text to translate and sourceLangCode is the code that indicates what language sourceText is (“tr”). As with the language detector, we have a setting object (MLRemoteTranslateSetting) used to create the MLRemoteTranslator object.
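The translation method can be sketched as follows; the resultText view it writes to is a hypothetical TextView from the layout:

```java
private void translate(String sourceText, String sourceLangCode) {
    MLRemoteTranslateSetting setting = new MLRemoteTranslateSetting.Factory()
            .setSourceLangCode(sourceLangCode)  // e.g. "tr", from the detector
            .setTargetLangCode("en")            // fixed here; could be taken from the user
            .create();
    MLRemoteTranslator myRemoteTranslator =
            MLTranslatorFactory.getInstance().getRemoteTranslator(setting);

    Task<String> translateTask = myRemoteTranslator.asyncTranslate(sourceText);
    translateTask.addOnSuccessListener(new OnSuccessListener<String>() {
        @Override
        public void onSuccess(String translatedText) {
            resultText.setText(translatedText);
        }
    }).addOnFailureListener(e -> Log.e(TAG, "Translation failed", e));
}
```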

In this example, the Text Translation service is used as an on-cloud capability; therefore it supports 12 languages.

Finally, release resources after the translation is complete.
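As with the detector, a stop call releases the translator (a sketch):

```java
// Stop the translator once it is no longer needed.
if (myRemoteTranslator != null) {
    myRemoteTranslator.stop();
}
```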

You can find project here.
