Running a GPU-enabled Azure Custom Vision Docker container on an Nvidia Jetson Nano

Henk Boelman
Global AI Community
Jan 3, 2020

In this article we will go through the steps needed to run a computer vision container created with Microsoft Azure Custom Vision inside Docker on a GPU-enabled Nvidia Jetson Nano.

Microsoft Azure Custom Vision is an AI service and end-to-end platform for applying computer vision to your specific scenario.

By the end of this walkthrough you will be running your own model in a Docker container on a GPU-enabled Nvidia Jetson Nano.

Nvidia Jetson Nano in a case with extra fan

1 — Setup your device

First you have to set up your device with the latest OS version and change some settings.

1.1 Install the latest operating system

Install the latest version of the operating system on the Jetson Nano. The Nvidia learn website has a great tutorial for that.
Follow the instructions here. When the device boots and the desktop appears on the screen you can continue with the next step.

1.2 Configure the Jetson Nano

Before we can run the Docker containers created by Azure Custom Vision, we have to change some settings on the Nano.

Connect to the Nano through SSH or open a terminal.

Disable the UI and set the Nano to high-power (10W) mode to speed up the device.

sudo systemctl set-default multi-user.target
sudo nvpmodel -m 0

Set the Nvidia runtime as the default runtime in Docker.
Your /etc/docker/daemon.json file should look like this.

{
    "default-runtime": "nvidia",
    "runtimes": {
        "nvidia": {
            "path": "nvidia-container-runtime",
            "runtimeArgs": []
        }
    }
}
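Because a malformed daemon.json stops the Docker daemon from starting at all, it is worth validating the file before installing it. A minimal sketch, assuming python3 is available (it ships with the Nano image); the /tmp scratch path is just a convenience:

```shell
# Write the runtime config to a scratch file first so a typo can be
# caught before it takes the Docker daemon down.
cat > /tmp/daemon.json <<'EOF'
{
    "default-runtime": "nvidia",
    "runtimes": {
        "nvidia": {
            "path": "nvidia-container-runtime",
            "runtimeArgs": []
        }
    }
}
EOF
# json.tool fails loudly on invalid JSON.
python3 -m json.tool /tmp/daemon.json > /dev/null && echo "daemon.json is valid JSON"
# Then, on the Nano itself (shown, not executed here):
#   sudo cp /tmp/daemon.json /etc/docker/daemon.json
#   sudo systemctl restart docker
```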

Update your Nano OS and packages to the latest versions

sudo apt-get update
sudo apt-get dist-upgrade

Add the current user to the docker group so you can use Docker commands without sudo

sudo groupadd docker
sudo usermod -aG docker $USER
newgrp docker

Reboot your device

sudo reboot

1.3 Test your GPU support

docker run -it jitteam/devicequery ./deviceQuery

When the last line states Result = PASS you can go to step 2; otherwise, follow the instructions on screen to enable GPU support in Docker.

2 — Train your model and download your container

With Azure Custom Vision you can create computer vision models and export these models to run locally on your machine. One of the export options is a Docker container that exposes the model through an API. When you export a container there are three variants to choose from: Windows, Linux and Raspberry Pi.

But with a few tweaks to the Linux Dockerfile you can enable TensorFlow GPU and ARM64 support on the Nano.

First, let’s create a model and download the Docker container. You can create your own model using the tutorials below or download my unmodified container.

Download sample container

This zip file contains a model that can classify Simpsons Lego figures.
Download

Create your model

Create your classification model using Microsoft Azure Custom Vision.
- Use Python to create a classification model
- Create classification model through the interface

When you have created your classification model:

  • Go to the latest iteration
  • Click on Export (if Export is disabled, make sure you have trained using a ‘compact’ domain)
  • Select Docker
  • Choose the Linux download
  • Copy the download link (right-click on the download button)

3 — Modify and run the Custom Vision container on the Jetson Nano

Now that we have the zip file containing the Dockerfile and model, we can download it and modify it to run on the Jetson Nano. It is easier to do the steps below through an SSH session.

Download the zip file

wget -O customvision.zip "https://github.com/hnky/blog/raw/master/downloads/HomerOrMarge.DockerFile.Linux.zip"

You can replace the link to the zip file with the link to your container.

Unzip the file downloaded from the Custom Vision Service

unzip customvision.zip -d customvision

Edit the Dockerfile

cd customvision
nano Dockerfile

The contents of the Dockerfile look like this:

FROM python:3.7-slim
RUN pip install -U pip
RUN pip install numpy==1.17.3 tensorflow==2.0.0 flask pillow
COPY app /app
# By default, we run manual image resizing to maintain parity with CVS webservice prediction results.
# If parity is not required, you can enable faster image resizing by uncommenting the following lines.
# RUN echo "deb http://security.debian.org/debian-security jessie/updates main" >> /etc/apt/sources.list && apt update -y
# RUN apt install -y libglib2.0-bin libsm6 libxext6 libxrender1 libjasper-dev libpng16-16 libopenexr23 libgstreamer1.0-0 libavcodec58 libavformat58 libswscale5 libqtgui4 libqt4-test libqtcore4
# RUN pip install opencv-python
# Expose the port
EXPOSE 80
# Set the working directory
WORKDIR /app
# Run the flask server for the endpoints
CMD python -u app.py

Replace the contents of the Dockerfile with this:

FROM nvcr.io/nvidia/l4t-base:r32.2
RUN apt-get update -y
RUN apt-get install python3-pip -y
RUN pip3 install -U pip
RUN DEBIAN_FRONTEND=noninteractive apt-get install libhdf5-serial-dev hdf5-tools libhdf5-dev zlib1g-dev zip libjpeg8-dev -y
RUN DEBIAN_FRONTEND=noninteractive apt-get install python3 python3-dev python-dev build-essential libssl-dev libffi-dev libxml2-dev libxslt1-dev zlib1g-dev -y
RUN DEBIAN_FRONTEND=noninteractive apt-get install -y python3-opencv
RUN pip3 install --pre tensorflow-gpu==2.0.0 --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v42
RUN pip3 install flask pillow
COPY app /app
# Expose the port
EXPOSE 80
# Set the working directory
WORKDIR /app
# Run the flask server for the endpoints
CMD python3 -u app.py

Build, run and test the container

With this configuration in the Dockerfile the container is capable of running TensorFlow with GPU support on the ARM64-based Jetson Nano.

Build the container (this will take a while)

docker build . -t mycustomvision

Run the container

docker run -p 127.0.0.1:80:80 -d mycustomvision

Test the container

curl -X POST http://127.0.0.1/url -d '{ "url": "https://github.com/hnky/blog/raw/master/downloads/marge.jpg" }'
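To act on the prediction from a script, you can extract the most probable tag from the JSON reply. A hedged sketch: the predictions, tagName and probability field names follow the Custom Vision export format at the time of writing, so adjust them if your container's response differs.

```shell
# Pull the highest-probability tag out of the container's JSON reply.
# python3 is already on the Nano; jq would work too if installed.
top_tag() {
    python3 -c 'import sys, json; p = json.load(sys.stdin)["predictions"]; print(max(p, key=lambda x: x["probability"])["tagName"])'
}
# Against the running container (shown, not executed here):
#   curl -s -X POST http://127.0.0.1/url \
#        -d '{ "url": "https://github.com/hnky/blog/raw/master/downloads/marge.jpg" }' | top_tag
# Demonstration with a canned reply of the assumed shape:
echo '{"predictions": [{"tagName": "marge", "probability": 0.97}, {"tagName": "homer", "probability": 0.03}]}' | top_tag
```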

4 — Push the container to an Azure Container Registry

When you are happy with your container you can push it to an Azure Container Registry so you can use it later on.

Follow this tutorial on how you can create an Azure Container Registry
Create an Azure Container Registry

When you have created a registry you need to log in from the Jetson Nano

docker login myregistry.azurecr.io

Build the container with the right tag

docker build . -t myregistry.azurecr.io/mycustomvision
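The rebuild hits the Docker layer cache, so it is fast, but you can also simply retag the image you already built. A short sketch; the v1 tag is an assumption, shown because pinning an explicit version avoids ambiguity with the implicit :latest tag:

```shell
# Compose a fully qualified image name with an explicit version tag.
REGISTRY="myregistry.azurecr.io"
IMAGE="mycustomvision"
TAG="v1"   # assumption: pin a version instead of relying on :latest
FULL_NAME="$REGISTRY/$IMAGE:$TAG"
echo "$FULL_NAME"   # prints: myregistry.azurecr.io/mycustomvision:v1
# On the Nano you would then retag and push (shown, not executed here):
#   docker tag mycustomvision "$FULL_NAME"
#   docker push "$FULL_NAME"
```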

Push the container to the registry

docker push myregistry.azurecr.io/mycustomvision

