Real-Time Custom Object Detection Using the TensorFlow App
Some basics of object detection with deep-learning methods like You Only Look Once (YOLO) and Single Shot MultiBox Detector (SSD) will help you get through the architecture. Knowing Android Studio is a plus, but not to worry, I am here to help. True altruism, guys!!!
Object Detection Methods
These days, YOLO and SSD are the detectors most used in practice, and those are the ones I am going to discuss in this post. Faster R-CNN and the other R-CNN methods are good, and they have an edge over these methods in detection accuracy and bounding-box regression. But they are slow. Who needs a 7–10 fps object detector when we can process frames at 50–70 fps? We need real-time applications. I won't discuss the internals of SSD and YOLO in this post; that's why I mentioned having prior knowledge of their mechanisms.
Google has released its TensorFlow demo apps and they are really easy to get running. You don't need to be a Java or Android Studio expert; being a deep-learning enthusiast, I have only ever worked in Python. Believe it or not!!! Here is the GitHub repo for TensorFlow:
tensorflow - Computation using data flow graphs for scalable machine learning (github.com)
Clone or download the repo, then install Android Studio. Download the latest version from the Android website for your operating system.
Basic Android Studio Setup Tips (people familiar with Android Studio can skip this part)
A word of warning: the initial Android setup may hang your machine if your laptop has only 8 GB of RAM. Once it finishes, congratulations on your SDK environment setup. Next, install the NDK, version 12b or later; the apps need it for their native code. And here's the glitch: make sure you are not behind a proxy server, otherwise you might get stuck downloading the initial packages needed for setup. I was trapped there due to my college server proxy. Use some of your own data, guys!! It's not that much. When testing the TensorFlow apps, make sure your phone or test device runs Android 5.0 (API level 21) or above.
Go to the directory /tensorflow/tensorflow/examples and open the android folder in Android Studio. Now build the APKs and you are ready to go. Holy moly, you can do that!!!
So, this basic TensorFlow project will install 4 apps on your mobile, totalling 99 MB!! The four apps are:
- TF Classifier App.
- TF Detector App.
- TF Stylize App.
- TF Speech Recognition App.
Now, here we are focused on the Detector app only. Edit the AndroidManifest.xml file and comment out the rest of the apps; you can find it in the /android folder. You guys must be curious what makes the build 99 MB. A deep-learning app needs a protobuf file (.pb) and a text label file (.txt), if you are familiar with TensorFlow. These files store the graph structure, nodes, weights, hyperparameters, etc. As we are focusing only on detection, get rid of the other apps' models. Go to the folder /android/assets and remove the other models, such as inception_graph.pb, stylize_quantized.pb, conv_actions_frozen.pb, etc.
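A sketch of what the trimmed manifest can look like, keeping only the detector. The activity names here are my assumption of what the demo declares; check your own AndroidManifest.xml for the exact entries and attributes:

```xml
<!-- AndroidManifest.xml (trimmed sketch): keep only the Detector activity. -->
<application android:label="TensorFlow Demo">
    <activity android:name="org.tensorflow.demo.DetectorActivity"
              android:label="TF Detect"/>
    <!-- Commented out to slim the build down to the detector:
    <activity android:name="org.tensorflow.demo.ClassifierActivity"/>
    <activity android:name="org.tensorflow.demo.StylizeActivity"/>
    <activity android:name="org.tensorflow.demo.SpeechActivity"/>
    -->
</application>
```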
Here you will find different files. Delete every file except coco_labels_list.txt and ssd_mobilenet_v1_android_export.pb. Subsequently, in AndroidManifest.xml, delete the lines of code for the Stylize, Speech Recognition and Classifier apps. Now your app size will be reduced. So, now on to a specific objects list. The architecture used in the default TensorFlow app is SSD MobileNet. It's efficient and fast too, as you can see below:
Comparing SSD MobileNet with YOLO, YOLO has more localization errors. Setting Tiny YOLO against full YOLO, the file size drops drastically, as it is trained on only 20 object classes. You can either download the converted file directly, or convert the weights file from the DarkNet repository yourself as explained in this GitHub repo:
darkflow - Translate darknet to tensorflow. Load trained weights, retrain/fine-tune using tensorflow, export constant… (github.com)
Now, put this file in the assets directory to get started. Here I will edit things for the YOLO file only, not touching SSD MobileNet. Go to the directory /tensorflow/tensorflow/examples/android/src/org/tensorflow/demo, or directly search for DetectorActivity.java in the android folder and open its location. I still use the latter way. :-)
Open DetectorActivity.java and go to line 86: private static final DetectorMode MODE = DetectorMode.TF_OD_API;
Change it to DetectorMode.YOLO; as you can see in the file, the three modes described are YOLO, Multibox and TF_OD_API (SSD MobileNet). YOLO mode is set up, guys. Now, just the last step to restrict it to the specific object you want.
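Here is a minimal, self-contained sketch of that switch. The enum mirrors the three back-ends named in DetectorActivity.java; the real activity wires MODE into which detector it instantiates, which this toy class does not attempt:

```java
// Toy stand-in for the MODE constant edited in DetectorActivity.java.
public class DetectorModeDemo {
    // The three detector back-ends the demo file names.
    enum DetectorMode { TF_OD_API, MULTIBOX, YOLO }

    // Before the edit: DetectorMode.TF_OD_API (SSD MobileNet).
    static final DetectorMode MODE = DetectorMode.YOLO;

    public static void main(String[] args) {
        System.out.println("Active detector: " + MODE);
    }
}
```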
Open TensorFlowYoloDetector.java in the same folder as DetectorActivity.java and go to line 226: if (confidenceInClass > 0.01). Add a condition here for the labelled class you want to detect and track. For example, I am working on cars, so I changed it to: if (confidenceInClass > 0.01 && "car".equals(LABELS[detectedClass])). (Note: in Java, compare strings with equals(), not ==.) Don't stress out if the class you are looking for is not present there. You can customize YOLO by training it on a specific dataset.
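To see what the added condition does, here is a self-contained sketch of the filter. LABELS here is a made-up stand-in for the label array the detector loads from its label file; detectedClass and confidenceInClass are supplied by the detector loop in the real code:

```java
// Sketch of the class filter added inside TensorFlowYoloDetector.java.
public class ClassFilterDemo {
    // Hypothetical labels; the real array comes from the model's label file.
    static final String[] LABELS = {"person", "bicycle", "car", "dog"};

    // Keep a detection only if it clears the confidence threshold AND is
    // the class we care about. Note String.equals, not == (reference compare).
    static boolean keepDetection(int detectedClass, float confidenceInClass) {
        return confidenceInClass > 0.01f && "car".equals(LABELS[detectedClass]);
    }

    public static void main(String[] args) {
        System.out.println(keepDetection(2, 0.8f));   // confident car -> kept
        System.out.println(keepDetection(3, 0.8f));   // dog -> dropped
        System.out.println(keepDetection(2, 0.005f)); // below threshold -> dropped
    }
}
```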
For tracking purposes, open the build.gradle file in Android Studio, go to line 23, and change nativeBuildSystem from 'none' to 'cmake'. There you go!!! Now you can detect and track your specific objects. If you want to go deeper, have an in-depth look at TensorFlowObjectDetectionAPIModel.java and TensorFlowYoloDetector.java. I think I have solved your problem up to your desire. Still, any queries, just post them in the discussion. Otherwise, LAUNCH it up!!!
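For reference, the relevant build.gradle fragment looks roughly like this (the exact line number and surrounding comments may differ between TensorFlow versions):

```groovy
// build.gradle in the android example, near the top of the file.
// 'none' skips the native build entirely; 'cmake' compiles the C++
// libraries, including the object tracker that follows boxes between
// detection frames.
def nativeBuildSystem = 'cmake'   // was: 'none'
```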
Please appreciate it by clapping if I was able to serve your purpose. Thanks!!!