Handwriting number recognizer with Flutter and Tensorflow (part II)

Sergio Fraile · Flutter Community · Oct 17, 2019 · 8 min read

Great to have you back for the second part of the series! 🥳 Hope you are as excited as I am for what’s coming here.

If you just landed on this post and want to catch up with the first article, you will find it here.

For everybody else, a quick recap of what we did previously: we got hands-on with machine learning in TensorFlow, diving into the basics of understanding what's going on; but most importantly for the purposes of this new post, we ended up with an exported TensorFlow Lite model. We will need that model for what's coming in this post.

Disclaimer: In order to keep the article dynamic, I'll try not to spend too much time on basic Flutter concepts. If you are looking for a starting-point tutorial on Flutter, there are better-suited articles in Flutter Community than this one.

I’ll be using Android Studio but feel free to use any text editor or IDE.

As usual, all the code will be available at the end of the tutorial.

What to expect?

In this second article we are going to set up our Flutter project and install the required dependencies for using TensorFlow Lite. We will import our model into the project and create the basic layout skeleton.

Creating the project

Let’s start creating a Flutter application! I’ll call the project handwritten_number_recognizer but feel free to name it as you prefer.
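If you prefer the command line, this is all it takes; the flutter tool scaffolds the whole project for you:

flutter create handwritten_number_recognizer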

Once you have the project created, you should be seeing something like this:

The first thing we are going to do is clean up that main.dart file and leave it as follows:

import 'package:flutter/material.dart';

void main() => runApp(HandwrittenNumberRecognizerApp());

class HandwrittenNumberRecognizerApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Number Recognizer',
      theme: ThemeData(
        primarySwatch: Colors.blue,
      ),
      home: null, // we will fill this in shortly
    );
  }
}

You will notice we are leaving the home parameter as null for the moment (so the code still compiles); we will come right back to it.

Creating the scene

Now, next to your lib/main.dart file, create a new file called recognizer_screen.dart and paste the following:

import 'package:flutter/material.dart';

class RecognizerScreen extends StatefulWidget {
  RecognizerScreen({Key key, this.title}) : super(key: key);

  final String title;

  @override
  _RecognizerScreenState createState() => _RecognizerScreenState();
}

class _RecognizerScreenState extends State<RecognizerScreen> {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text(widget.title),
      ),
      body: Container(
        child: Text('My screen'),
      ),
    );
  }
}

This is going to be the screen where we are going to work, so we have the code separated from the main app class.

Do you remember that home parameter we left empty? Go back to main.dart and import the new file we just created:

import 'package:handwritten_number_recognizer/recognizer_screen.dart';

And fill the value in the home parameter:

home: RecognizerScreen(title: 'Number recognizer',),

If you are wondering why all those commas at the end of each “part” of the line, try reformatting the code with dartfmt. If you are in Android Studio with the Dart plugin installed, right-clicking in the editor should show that option. There is also a Dart plugin for Visual Studio Code that can run this formatting tool.
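After both changes, and once formatted with the trailing commas in place, the build method in main.dart ends up like this:

@override
Widget build(BuildContext context) {
  return MaterialApp(
    title: 'Number Recognizer',
    theme: ThemeData(
      primarySwatch: Colors.blue,
    ),
    home: RecognizerScreen(
      title: 'Number recognizer',
    ),
  );
}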

At this point, if you run the app, you should be seeing something like this:

Our app’s first screen

Let’s import stuff

There are two things we need to import: the model and the TensorFlow Lite library. For both of them we will need to modify the pubspec.yaml file.

Importing the model

Remember the exported model we got in the previous article? Let's bring it into our app. To do so, we are going to create a folder called assets at the project root level and drag and drop the model into it. It should look something like this:

We will also create a labels.txt file, as you can see in the image above. Simply fill it with the digits from 0 to 9, one per line, as follows:

0
1
2
3
4
5
6
7
8
9

These are the labels that will be associated with the outputs of our model later. In the case of digits it is a no-brainer: output node 9 corresponds to the label 9. But imagine we were trying to classify something else, like types of flowers; then maybe output 1 would correspond to a rose. The labels file is what provides that correspondence for us.

Our prediction, as we will see later, is going to return an index, a label and a probability. In our case the label value will be taken from labels.txt.
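Just to set expectations, here is a sketch (with made-up values) of the shape of a prediction for a drawing of a nine:

// A sketch of the kind of result we will get back later.
// The values here are made up for illustration.
final prediction = {
  'index': 9,         // position of the winning output node
  'label': '9',       // the matching line from labels.txt
  'confidence': 0.95, // probability assigned by the model
};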

We are not done yet! In Flutter, we need to declare our assets in the pubspec.yaml file we mentioned earlier. If we open this file and get rid of most of the commented code, we have this:

name: handwritten_number_recognizer
description: A handwritten number recognizer built with Flutter and Tensorflow.

version: 1.0.0+1

environment:
  sdk: ">=2.1.0 <3.0.0"

dependencies:
  flutter:
    sdk: flutter

  cupertino_icons: ^0.1.2

dev_dependencies:
  flutter_test:
    sdk: flutter

flutter:
  uses-material-design: true

  # To add assets to your application, add an assets section, like this:
  # assets:
  #  - images/a_dot_burr.jpeg
  #  - images/a_dot_ham.jpeg

This file is written in YAML, and therefore indentation is key. Watch out not to add any extra indentation by accident (YAML also does not accept tabs for indentation).

I left the comments related to the assets on purpose. We have two options: we could declare the whole assets folder, or list the specific assets we want. Let's follow the second option and replace the commented code with these lines:

assets:
  - assets/converted_mnist_model.tflite
  - assets/labels.txt

Please bear in mind that assets should be indented at the same level as uses-material-design.
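To leave no doubt about the indentation, this is how the whole flutter section of pubspec.yaml should end up looking:

flutter:
  uses-material-design: true

  assets:
    - assets/converted_mnist_model.tflite
    - assets/labels.txt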

Importing Tensorflow Lite

The next thing we need is a TensorFlow Lite library, so we can run the inference (prediction) of our model locally on the device. Unfortunately, there is no official TensorFlow support for Flutter just yet, but a pretty good library called tflite already exists; you can find a link to it here.

If you open the link and go to the Installing tab, it tells us what we need to do: basically, add it to our dependencies. We do that in pubspec.yaml, and that part should look like this:

dependencies:
  flutter:
    sdk: flutter
  cupertino_icons: ^0.1.2
  tflite: ^1.0.4

Then we only need to get our new packages. If you use the command line, just run:

flutter pub get

However, if you are using the Flutter plugin in Android Studio, there should be a button called Packages get at the top right of your editor, you can press it to get the new packages.

There is one thing left to do to have tflite ready to use in our app. If you look at the pub.dev page for tflite, there is a table of contents whose first section is Installation. There they detail a manual step required to get it working on Android, and a troubleshooting one for iOS, as you can see in the image below.

Tflite installation page

Android
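The manual step for Android, as described in the tflite installation instructions, is to tell Gradle not to compress the model file. Add an aaptOptions block inside the android section of android/app/build.gradle:

android {
    aaptOptions {
        noCompress 'tflite'
        noCompress 'lite'
    }
}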

You will also need (at the time of writing this article) a minimum SDK version of 19 for Android. You can change this in android/app/build.gradle (the same file where you added the aaptOptions above):

minSdkVersion 19

iOS

Tflite requires a minimum iOS deployment target of 9.0. You can typically set this at the top of ios/Podfile (platform :ios, '9.0') or in the Xcode project settings.
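With the package installed and our assets declared, loading the model will eventually look roughly like this. This is just a preview sketch based on the tflite package documentation; we will wire it up properly later in the series:

import 'package:tflite/tflite.dart';

Future<void> loadModel() async {
  // The paths match the assets we declared in pubspec.yaml.
  String res = await Tflite.loadModel(
    model: 'assets/converted_mnist_model.tflite',
    labels: 'assets/labels.txt',
  );
  print('Model loaded: $res'); // prints "success" if everything went well
}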

Building the scene

Alright, our assets and dependencies are all set up. Now we need a way to use our finger to draw on the screen, which means we should start composing that RecognizerScreen class we created before. Let's sketch a quick layout for our page.

App layout

Open again the file we created at the beginning to contain our scene, recognizer_screen.dart, and replace the Container in the body property with the following. I will explain it in a second.

Container(
  child: Column(
    mainAxisAlignment: MainAxisAlignment.center,
    children: <Widget>[
      Expanded(
        flex: 1,
        child: Container(
          padding: EdgeInsets.all(16),
          color: Colors.red,
          alignment: Alignment.center,
          child: Text('Header'),
        ),
      ),
      Container(
        padding: EdgeInsets.all(16),
        color: Colors.green,
        child: SizedBox(
          width: 200,
          height: 300,
          child: Text('Canvas'),
        ),
      ),
      Expanded(
        flex: 1,
        child: Container(
          padding: EdgeInsets.all(16),
          color: Colors.blue,
          alignment: Alignment.center,
          child: Text('Footer'),
        ),
      ),
    ],
  ),
),

If you try to run the app, you should be seeing something like this:

What we have done is divide our scene into three sections: header, canvas and footer.

Header and footer are both flex containers set to 1, meaning they will share equally between them all the non-fixed space available on the screen, so they should resize nicely on different screen sizes.

If you are not familiar with the concept of flex, it comes from web development and you can find plenty of information about it there. However, I find the best documentation to wrap your head around it is the one for React Native, which you can find here. The flex implementation in Flutter follows the same concept, even though it is implemented differently than on the web or in React Native; the most important thing is that you understand the concept.
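For a quick intuition of how flex splits space in Flutter, consider this minimal sketch: the two Expanded widgets divide the free space in a 1:2 ratio.

Column(
  children: <Widget>[
    Expanded(
      flex: 1, // takes 1/3 of the available space
      child: Container(color: Colors.red),
    ),
    Expanded(
      flex: 2, // takes 2/3 of the available space
      child: Container(color: Colors.blue),
    ),
  ],
)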

The canvas area, on the other hand, has a fixed size. It is the only part of our application that we don't want resized on different screens: our painting area will always be the same size. At the moment it is represented by that SizedBox widget, but we will replace it in the next section.

You made it again!

Congratulations! 👏 Now we have an app up and running with a layout skeleton. You should be celebrating!

In the next article we are going to start developing the components of our layout, particularly we will focus on being able to draw with our finger in the screen.

As usual, you can access all the code from this section here.

Looking forward to seeing you in the next section! 👋
