Android object detection demo on COCO dataset using OpenVINO Toolkit Java API 2.0 with ARM plugin

Bo Lv
Published in OpenVINO-toolkit
Sep 26, 2022

Introduction

ARM processors are widely used in Android devices. The Intel Distribution of OpenVINO toolkit has recently added support for ARM CPUs, so it is now possible to run neural networks on ARM-based Android devices through the OpenVINO Toolkit.

This article shows you how to run a demo on ARM Android devices with Intel OpenVINO. We will build OpenVINO with the Java API 2.0 wrappers and the ARM plugin libraries for ARM Android devices, and run an object detection application that detects the 91 COCO classes. The application can run in the Android Studio emulator or on real devices. It reads frames from the camera with an OpenCV module, gets results from an object detection network (such as SSD, Pelee, or EfficientDet), and displays the results with differently colored bounding boxes, as sketched below.
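To make the flow concrete, here is a minimal sketch of the per-frame loop, written against OpenCV's Android camera callback. The ObjectDetector and Detection types are hypothetical placeholders for the OpenVINO inference code (a sketch of which appears later in this article), not the demo's actual classes.

import java.util.List;

import org.opencv.android.CameraBridgeViewBase;
import org.opencv.core.Mat;
import org.opencv.core.Point;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;

// Hypothetical result holder: one detected object (COCO class id, score, box corners).
class Detection {
    int classId;
    float score;
    Point topLeft, bottomRight;
}

// Hypothetical wrapper around the OpenVINO inference code sketched later in the article.
interface ObjectDetector {
    List<Detection> detect(Mat frame);
}

public class FrameProcessor implements CameraBridgeViewBase.CvCameraViewListener2 {
    private final ObjectDetector detector;
    private final String[] cocoLabels; // the 91 COCO class names, loaded from assets
    private final Scalar[] palette;    // one color per class id

    public FrameProcessor(ObjectDetector detector, String[] cocoLabels, Scalar[] palette) {
        this.detector = detector;
        this.cocoLabels = cocoLabels;
        this.palette = palette;
    }

    @Override public void onCameraViewStarted(int width, int height) {}
    @Override public void onCameraViewStopped() {}

    @Override
    public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
        Mat frame = inputFrame.rgba();                     // current camera frame
        List<Detection> results = detector.detect(frame);  // run the detection network

        // Draw every detection with a class-specific color and label.
        for (Detection d : results) {
            Scalar color = palette[d.classId % palette.length];
            Imgproc.rectangle(frame, d.topLeft, d.bottomRight, color, 2);
            Imgproc.putText(frame, cocoLabels[d.classId] + String.format(" %.2f", d.score),
                    d.topLeft, Imgproc.FONT_HERSHEY_SIMPLEX, 0.8, color, 2);
        }
        return frame;                                      // the returned Mat is rendered on screen
    }
}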

Here is a video showing the result.

Now let’s start to run it on your own device!

Steps

Build the OpenVINO libraries for Android

Before we run the demo on ARM Android phones, we need to prepare the ARM plugin and Java bindings libraries for OpenVINO. These libraries are built on Ubuntu 18.04, and the following steps show how to build them for the Java API 2.0.

  • Set up the Java environment
sudo apt-get install -y openjdk-8-jdk
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
  • Create work directory
mkdir openvino_android
export WORK_DIR="$(pwd)/openvino_android"
cd $WORK_DIR
  • Clone OpenVINO and OpenVINO Contrib repositories (Use master branch for Java API 2.0).
git clone --recurse https://github.com/openvinotoolkit/openvino.git "$WORK_DIR/openvino"
git clone --recurse https://github.com/openvinotoolkit/openvino_contrib.git "$WORK_DIR/openvino_contrib"
  • Download the Android NDK and set up the environment for it. (If you need a proxy, replace XXX with your proxy host and port; otherwise just remove --no_https --proxy=http --proxy_host=XXX --proxy_port=XXX.)
mkdir "$WORK_DIR/android-tools"
wget https://dl.google.com/android/repository/commandlinetools-linux-7583922_latest.zip
unzip commandlinetools-linux-7583922_latest.zip
yes | ./cmdline-tools/bin/sdkmanager --sdk_root="$WORK_DIR/android-tools" --licenses --no_https --proxy=http --proxy_host=XXX --proxy_port=XXX
./cmdline-tools/bin/sdkmanager --sdk_root="$WORK_DIR/android-tools" --install "ndk-bundle" --no_https --proxy=http --proxy_host=XXX --proxy_port=XXX
  • Build OpenVINO and ARM plugin for ARM64
mkdir "$WORK_DIR/openvino_build" "$WORK_DIR/openvino_install"
cmake -GNinja \
-DVERBOSE_BUILD=ON \
-DCMAKE_BUILD_TYPE=Release \
-DCMAKE_TOOLCHAIN_FILE="$WORK_DIR/android-tools/ndk-bundle/build/cmake/android.toolchain.cmake" \
-DANDROID_ABI=arm64-v8a \
-DANDROID_STL=c++_shared \
-DANDROID_PLATFORM=29 \
-DENABLE_SAMPLES=ON \
-DENABLE_INTEL_MYRIAD=OFF \
-DENABLE_INTEL_MYRIAD_COMMON=OFF \
-DBUILD_java_api=ON \
-DTHREADING=SEQ \
-DIE_EXTRA_MODULES="$WORK_DIR/openvino_contrib/modules" \
-DARM_COMPUTE_SCONS_JOBS=$(nproc) \
-DCMAKE_INSTALL_PREFIX="$WORK_DIR/openvino_install" \
-B "$WORK_DIR/openvino_build" -S "$WORK_DIR/openvino"
ninja -C "$WORK_DIR/openvino_build"
ninja -C "$WORK_DIR/openvino_build" install

The build artifacts are located in $WORK_DIR/openvino_install/runtime/lib/aarch64. We will use them later.

Please confirm that the plugins.xml in $WORK_DIR/openvino_install/runtime/lib/aarch64 contains a plugin named "CPU".

  • Build Java API 2.0 for Android
source $WORK_DIR/openvino_install/setupvars.sh
cd $WORK_DIR/openvino_contrib/modules/java_api
gradle build
  • Download and convert the “ssdlite_mobilenet_v2” model (or pelee-coco, efficientdet-d0-tf) with Open Model Zoo
git clone --depth 1 https://github.com/openvinotoolkit/open_model_zoo "$WORK_DIR/open_model_zoo"
cd "$WORK_DIR/open_model_zoo/tools/downloader"
python3 -m pip install -r requirements.in
omz_downloader --name ssdlite_mobilenet_v2 --output_dir $WORK_DIR/open_model_zoo/tools/downloader
omz_converter --name ssdlite_mobilenet_v2 --download_dir $WORK_DIR/open_model_zoo/tools/downloader --precisions FP16

For INT8 models, the OpenVINO team has verified the models listed in the appendix with the ARM plugin.
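Once the IR files and plugins.xml are in place (they are copied into the app in the next section), the demo can load them through the Java API 2.0. The sketch below shows the general shape of that code, assuming the snake_case bindings from the java_api module (Core, read_model, compile_model, create_infer_request, set_input_tensor, get_output_tensor). The Core constructor taking a plugins.xml path and the Tensor constructor taking an OpenCV Mat's data address are assumptions based on the module's samples, so check them against the version you built; input layout/precision preprocessing is omitted here.

import org.intel.openvino.CompiledModel;
import org.intel.openvino.Core;
import org.intel.openvino.InferRequest;
import org.intel.openvino.Model;
import org.intel.openvino.Tensor;
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.imgproc.Imgproc;

public class SsdLiteDetector {
    private final InferRequest request;

    // pluginsXmlPath and modelXmlPath point at plugins.xml and ssdlite_mobilenet_v2.xml
    // after they have been copied from the app assets to local storage; the matching
    // .bin file must sit next to the .xml.
    public SsdLiteDetector(String pluginsXmlPath, String modelXmlPath) {
        Core core = new Core(pluginsXmlPath);                       // registers the ARM CPU plugin
        Model model = core.read_model(modelXmlPath);                // reads the FP16 IR
        CompiledModel compiled = core.compile_model(model, "CPU");  // "CPU" maps to the ARM plugin
        request = compiled.create_infer_request();
    }

    // Runs one synchronous inference on a camera frame and returns the raw
    // DetectionOutput blob of shape [1, 1, N, 7]:
    // (image_id, label, confidence, xmin, ymin, xmax, ymax) per detection.
    public float[] infer(Mat frame) {
        Mat resized = new Mat();
        Imgproc.resize(frame, resized, new Size(300, 300));         // SSDLite input resolution

        // Assumption: the model was prepared (e.g. with a PrePostProcessor) to accept
        // an NHWC u8 tensor backed by the OpenCV Mat's native data pointer.
        int[] shape = new int[] {1, resized.rows(), resized.cols(), 3};
        Tensor input = new Tensor(shape, resized.dataAddr());
        request.set_input_tensor(input);
        request.infer();
        return request.get_output_tensor().data();
    }
}

The returned blob can then be filtered by confidence and mapped back to frame coordinates before drawing the boxes, as in the frame-processing sketch earlier.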

Import demo project on Android Studio

In this step, we will import the demo project that runs object detection inference.

  • Download and install Android Studio on your PC.
  • Clone the OpenVINO Contrib repository (latest master branch).
git clone https://github.com/openvinotoolkit/openvino_contrib.git "$WORK_DIR/demo"
  • Select “File -> Open”, and import the demo project in "$WORK_DIR/demo/modules/arm_plugin/demos/coco_detection_android_demo".
Figure.1. Open Project
  • Copy libraries and model files to the corresponding folder.
  1. Copy "$WORK_DIR/openvino_contrib/modules/java_api/build/libs/java_api.jar" to the app/libs folder, and add it as a library.
  2. Copy "$WORK_DIR/openvino_install/runtime/lib/aarch64/*.so" and "$WORK_DIR/android-tools/ndk-bundle/sources/cxx-stl/llvm-libc++/libs/arm64-v8a/libc++_shared.so" to "app/src/main/jniLibs/arm64-v8a"
  3. Copy "$WORK_DIR/openvino_install/runtime/lib/aarch64/plugins.xml" to "app/src/main/assets"
  4. Copy "$WORK_DIR/open_model_zoo/tools/downloader/public/ssdlite_mobilenet_v2/FP16/ssdlite_mobilenet_v2.xml" and "$WORK_DIR/open_model_zoo/tools/downloader/public/ssdlite_mobilenet_v2/FP16/ssdlite_mobilenet_v2.bin" to "app/src/main/assets"
  • Add OpenCV dependency to project
  1. Download OpenCV SDK for Android and unpack it.
  2. Import OpenCV module: select “File -> New -> ImportModule”, specify a path to unpacked SDK, and set module name to “ocv”.
Figure.2. Import OpenCV dependency
Figure.3. Specify a path to unpacked OpenCV SDK

3. Replace compileSdkVersion 26 and targetSdkVersion 26 with compileSdkVersion 32 and targetSdkVersion 32 in the imported ocv module's build.gradle ("$WORK_DIR/demo/modules/arm_plugin/demos/coco_detection_android_demo/ocv/build.gradle")

Figure.4. Modify version
  • Start an ARM-based Android Emulator.
  1. Open AVD Manager -> Create Virtual Device, and choose a virtual device definition.
Figure.5. Choose AVD Manager
Figure.6. Create a virtual device

2. Select a system image with arm64-v8a.

Figure.7. Select ARM image
Figure.8. Verify configuration

3. Set the virtual device's camera to use your laptop's webcam.

Figure.9. Set camera config
  • Run it!😊😊

The first time you run the demo application on your device, you need to grant camera permission. Then run it again.
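Granting the permission and relaunching works fine; for reference, a typical Android activity can also request the camera permission at runtime along these lines (a generic sketch, not the demo's exact code):

import android.Manifest;
import android.content.pm.PackageManager;

import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

public class CameraPermissionHelper {
    public static final int CAMERA_REQUEST_CODE = 1;

    // Call from onCreate(); returns true if the camera can be used immediately.
    // Otherwise the system permission dialog is shown and the result arrives in the
    // activity's onRequestPermissionsResult(), where the camera preview can be started.
    public static boolean ensureCameraPermission(AppCompatActivity activity) {
        if (ContextCompat.checkSelfPermission(activity, Manifest.permission.CAMERA)
                == PackageManager.PERMISSION_GRANTED) {
            return true;
        }
        ActivityCompat.requestPermissions(activity,
                new String[] {Manifest.permission.CAMERA}, CAMERA_REQUEST_CODE);
        return false;
    }
}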

Testing

Test on SSD, EfficientDet, and Pelee models

The OpenVINO runtime does not currently support many INT8 models for inference on ARM devices (the verified INT8 models are listed in the appendix), but it will support more models in this format in the near future and achieve better performance with them.

For FP32 and FP16 models there is no speed difference between the two precisions, because the plugin automatically converts the model to FP32 precision. It is still recommended to use the FP16 model, since it takes up less storage space when deployed. It is expected that in later versions FP16 will also bring speedups.

The inference speed of the models in the actual test is as follows.

Figure.10. Inference speed

Test with multiple threads

I tested five threads running identical models in parallel; only the profiler's results are shown here. The Async API still runs well on ARM Android devices, which demonstrates that asynchronous parallel inference with multiple models is feasible. For the asynchronous processing, I use the async API to overlap preprocessing and inference; if the Android device has multiple cores, this can also be combined with multiple threads to carry out the computation. When running multiple sample threads, the Profiler tool in Android Studio shows the following results.

Figure.11. Android Studio Profiler result
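For reference, the multi-model test boils down to giving each worker thread its own infer request on the same compiled model. Below is a minimal sketch with plain Java threads and synchronous infer() calls (the demo itself uses the async API to overlap preprocessing and inference); as before, the OpenVINO class and method names follow the Java API 2.0 bindings and should be checked against the module you built.

import java.util.ArrayList;
import java.util.List;

import org.intel.openvino.CompiledModel;
import org.intel.openvino.Core;
import org.intel.openvino.InferRequest;
import org.intel.openvino.Model;

public class MultiThreadTest {

    // Runs `threads` workers in parallel on the ARM CPU plugin, `iterations` inferences
    // each, with one InferRequest per thread on the same compiled model.
    public static void run(String pluginsXmlPath, String modelXmlPath,
                           int threads, int iterations) throws InterruptedException {
        Core core = new Core(pluginsXmlPath);
        Model model = core.read_model(modelXmlPath);
        CompiledModel compiled = core.compile_model(model, "CPU");

        List<Thread> workers = new ArrayList<>();
        for (int t = 0; t < threads; t++) {
            InferRequest request = compiled.create_infer_request(); // one request per thread
            Thread worker = new Thread(() -> {
                for (int i = 0; i < iterations; i++) {
                    // The input tensor is assumed to have been set on the request
                    // beforehand, e.g. as in the detector sketch earlier.
                    request.infer();
                }
            });
            workers.add(worker);
            worker.start();
        }
        for (Thread w : workers) {
            w.join(); // wait for all workers, then inspect timings in the Android Studio Profiler
        }
    }
}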

Conclusion

In this GSoC 2022 project, porting object detection to Android devices was a challenge for me. I needed to follow the community to keep up with the project tasks, and updating to API 2.0 took me more time, but I gained a lot of useful development knowledge. Once again, I am very grateful to my mentor and the community for their guidance, and I will continue to be active in community development after this event.😊

Appendix

Here are the verified INT8 models. The list will be expanded as the OpenVINO Toolkit develops.

- open-closed-eye-0001
- quartznet-decoder
- mnasnet-0.5
- face-recognition-mobilefacenet-arcface
- ultra-lightweight-face-detection-slim-320
- Sphereface
- mnasnet-1.0
- spnasnet-100
- face-recognition-resnet50-aws
- fbnetc-100
- fsrcnn-x4
- mobilefacedet-v1-mxnet
- regnetx-3.2gf
- densenet-121
- dpn-68
- densenet-169
- resnet-v2-50
- densenet-201
- Sharpen-LensBlur
- human-pose-estimation-0001
- yolo-v2-tiny-ava-sparse-60-0001
- yolo-v2-tiny-ava-0001
- yolo-v2-tiny-ava-sparse-30-0001
- densenet-161
- yolo_v3_tiny
- yolo-tiny-v3
- yolo-v2-tiny-vehicle-detection-0001
- tiny_yolo_v2
- tiny_yolo_v1
- human-pose-estimation-3d-0001
- quartznet-encoder
- quartznet-15x5-en
- resnet-v2-101
- resnet-v2-152
- dna_r9.4.1_2d
- wdsr-small-x4
- single-image-super-resolution-1032
- topaz_video_super_resolution
- single-image-super-resolution-1033
- yolo_v2
- edsr3_super_resolution
- text-detection-0004
- Sharpen-Sharpen
- Denoise
- text-detection-0003
- yolof
- yolo_v4
