Flutter & ML Kit 1.0: AI-Powered Apps

Syed Muhammad Ahmad
4 min read · Mar 3, 2024


In the modern world, with thousands of new technologies, it is often difficult to pick the right stack and the right technology. I firmly believe that picking the right tech and the right strategy to achieve your goals is both a technical and a business decision. As mobile developers, when it comes to machine learning, we have various choices, from models deployed in the cloud to powerful native mobile SDKs. One such SDK is Google's ML Kit, which is used for

  1. On-device machine learning
  2. Using custom models
  3. Building AI-powered cross-platform (Android, iOS) apps

In this article, we will cover the following topics:

  • Setting up ML Kit for Flutter
  • Reading text from images
  • Detecting language
  • Reading barcodes

ML Kit has various APIs for vision and language. We will explore a few of them in this article and the rest in future articles.

Setting up ML Kit

If you want to be paid in the next 5–10 years, my suggestion is to hold tight and ✨ expand your knowledge base ✨, because AI is definitely doing it. Here's a little reminder:

Our boy Tom getting replaced by AI

If you don't want to end up like Tom, you'd better start following me and reading such articles so that you can beat AI. I hope this was motivating enough for you to start exploring ML Kit.

Setting up is pretty straightforward: install the google_ml_kit package from pub.dev (for example, with flutter pub add google_ml_kit). This is a fairly time-consuming process, since the whole SDK gets set up. You can go grab a coffee or touch grass, because if you have read the article this far I am sure you don't see grass too often.
The package we installed is an umbrella package that bundles everything required by all of the APIs. If you are tackling a specific use case and only need a particular functionality, you can install just the package required for that API.

Once the package is installed, run the app and wait for the SDK to be set up. While we are waiting, let's take a look at how ML Kit actually works.

ML Kit was developed for native platforms, and the package uses platform channels to communicate with the native code. Since all the work happens natively, no processing is done in Dart: every call is forwarded to the native platform, and the package acts as a bridge between your Dart code and the native ML Kit APIs. Further information is provided in the package docs.
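To make the idea concrete, here is a purely illustrative sketch of the platform-channel pattern such plugins rely on. The channel and method names below are made up for the example; they are not the real internals of the ML Kit plugin.

import 'package:flutter/services.dart';

// Illustrative only: a toy bridge showing how a plugin forwards work to native code.
class ToyTextRecognizerBridge {
  // The channel name here is hypothetical, not the one the ML Kit plugin uses.
  static const _channel = MethodChannel('example/text_recognizer');

  Future<String?> process(String imagePath) async {
    // Dart only sends the request; the ML model runs in native (Kotlin/Swift) code.
    return _channel.invokeMethod<String>('process', {'path': imagePath});
  }
}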

Reading Text from Images

We won't talk much; let's just code. We can't pass a plain image file directly: we have to provide an object of type InputImage. Let's take a look.

import 'package:flutter/foundation.dart';
import 'package:google_ml_kit/google_ml_kit.dart';
import 'package:image_picker/image_picker.dart';

class AppProvider extends ChangeNotifier {
  XFile? image;
  bool picked = false;
  InputImage? inputImage;

  // Let the user pick an image from the gallery and wrap it in an InputImage.
  Future<void> pickImage() async {
    final imagePicker = ImagePicker();
    image = await imagePicker.pickImage(source: ImageSource.gallery);
    if (image == null) return;
    picked = true;
    inputImage = InputImage.fromFilePath(image!.path);
    notifyListeners();
  }
}

Once the image is picked, we can pass it as an argument to our next layer.

import 'package:flutter/foundation.dart';
import 'package:google_ml_kit/google_ml_kit.dart';

// Thin wrapper around ML Kit's TextRecognizer.
class TextRecognition {
  // Latin script covers the most common use cases; other scripts are also supported.
  static final textRecognizer = TextRecognizer(script: TextRecognitionScript.latin);

  static Future<String> recognizeText(InputImage inputImage) async {
    try {
      final recognisedText = await textRecognizer.processImage(inputImage);
      return recognisedText.text;
    } catch (e) {
      debugPrint(e.toString());
      throw Exception('Error recognizing text');
    }
  }
}

Yup, that’s it. We have read the text from an image.

We can display this text on the UI.
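As a rough sketch of what that UI could look like (assuming the provider package is used to expose the AppProvider above, together with the TextRecognition helper; the widget and names here are illustrative, not the article's actual app):

import 'package:flutter/material.dart';
import 'package:provider/provider.dart';

// Minimal, illustrative widget; a real app would lay this out differently
// and would cache the future instead of creating it on every build.
class RecognizedTextView extends StatelessWidget {
  const RecognizedTextView({super.key});

  @override
  Widget build(BuildContext context) {
    final provider = context.watch<AppProvider>();
    if (provider.inputImage == null) {
      return const Text('Pick an image first');
    }
    return FutureBuilder<String>(
      future: TextRecognition.recognizeText(provider.inputImage!),
      builder: (context, snapshot) {
        if (!snapshot.hasData) {
          return const CircularProgressIndicator();
        }
        return Text(snapshot.data!); // the recognized text
      },
    );
  }
}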

✨ Detected text ✨

Detecting Language

Now let's take a look at another use case: detecting the language of a piece of text. Warning: it's fairly simple.

Let’s define our language codes.

final _ids = {
  'en': 'English',
  'zh': 'Chinese',
  'ar': 'Arabic',
  'nl': 'Dutch',
  'de': 'German',
  'it': 'Italian',
  'pt': 'Portuguese',
  'ru': 'Russian',
  'ja': 'Japanese',
  'ko': 'Korean',
  'es': 'Spanish',
  'fr': 'French',
  'tr': 'Turkish',
  'pl': 'Polish',
  'sv': 'Swedish',
  'da': 'Danish',
  'fi': 'Finnish',
  'no': 'Norwegian',
  'el': 'Greek',
  'he': 'Hebrew',
  'id': 'Indonesian',
  'ms': 'Malay',
  'th': 'Thai',
  'hi': 'Hindi',
  'hu': 'Hungarian',
  'cs': 'Czech',
  'sk': 'Slovak',
  'uk': 'Ukrainian',
  'vi': 'Vietnamese',
  'ro': 'Romanian',
  'hr': 'Croatian',
  'ca': 'Catalan',
  'sr': 'Serbian',
  'sl': 'Slovenian',
  'bg': 'Bulgarian',
  'lt': 'Lithuanian',
  'lv': 'Latvian',
  'et': 'Estonian',
};

Now it's time to detect the language.

Future<String> getLanguage(String text) async {
  final languageIdentifier = LanguageIdentifier(confidenceThreshold: 0.5);
  final String response = await languageIdentifier.identifyLanguage(text);
  // Free the native resources once we have the result.
  await languageIdentifier.close();
  return _ids[response] ?? 'Unknown';
}

That’s it, we are done.
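For a quick sanity check, a hypothetical call could look like this (the sample strings and expected outputs are just illustrations):

// Hypothetical usage, called from an async method of the class that holds _ids and getLanguage.
final english = await getLanguage('Hello, how are you?');  // expected: 'English'
final french = await getLanguage('Bonjour tout le monde'); // expected: 'French'
debugPrint('$english / $french');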

✨ Detected Language ✨

Reading Barcodes

The steps are the same: get the input image, pass it to the function, and display the received text!

import 'package:flutter/foundation.dart';
import 'package:google_ml_kit/google_ml_kit.dart';

class BarcodeRecognition {
  // Scan for every supported barcode format.
  static final List<BarcodeFormat> _formats = [BarcodeFormat.all];
  static final _barcodeScanner = BarcodeScanner(formats: _formats);

  // Returns the display value of the first barcode found, or null if there is none.
  Future<String?> scanBarcode(InputImage image) async {
    try {
      final List<Barcode> barcodes = await _barcodeScanner.processImage(image);
      for (final Barcode barcode in barcodes) {
        return barcode.displayValue;
      }
    } catch (e) {
      debugPrint(e.toString());
      throw Exception('Error recognizing barcode');
    }
    return null;
  }
}

That’s it! We are done.
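A hypothetical usage sketch, reusing the InputImage from our provider (the names are assumed from the earlier snippets):

// Hypothetical usage, assuming provider.inputImage has already been set by pickImage().
final scanner = BarcodeRecognition();
final value = await scanner.scanBarcode(provider.inputImage!);
debugPrint(value ?? 'No barcode found');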

✨ Barcode detected ✨

So it's that simple to use Google ML Kit with Flutter: all we have to do is call a few functions and we are done.

To view such projects follow me on GitHub and connect with me on LinkedIn.

Till next time!
