Deep Learning with DNN Compiler


Rohit Sharma
Nov 22 · 2 min read

What’s DNN Compiler?

Deep Neural Network Compiler (DNNC) is an AOT compiler and inference framework. Part 1 of this article series showed how to use DNNC as an inference framework. This article shows how to use DNNC to compile a machine learning model for microcontrollers and microcomputers costing anywhere from 10¢ to $5.

And what is a deep learning model?

A deep learning model is a collection of programs, trained parameters (weights and biases), and associated compute elements. It can be stored as a Python program or in a format such as ONNX or TensorFlow protobuf. Formats like ONNX and TensorFlow protobuf are simple file formats; they cannot be executed directly on any computer.

How to run it?

DNNC accepts model formats such as ONNX and converts them into an executable that can run on a platform of your choice.

DNNC is designed to work on small form factor devices such as microcontrollers, CPUs, and other embedded platforms, including Raspberry Pi, Android, Arduino, SparkFun Edge, RISC-V boards, and mobile phones, as well as x86 and ARM laptops. It supports many compute architectures, including x86, x86_64, amd64, arm, arm64, armv7, and ppc64le. It can run on bare-metal hardware without an OS, and it supports many OS distributions: Ubuntu, CentOS, Arch Linux, Manjaro, macOS (Sierra), and Windows.

Compilation Steps

Compiling a deep learning model is as easy as 1, 2, 3.

  1. Get an ONNX model (download, convert, or export)
  2. Install DNN Compiler (pip install deepC)
  3. Compile it (compile-onnx <onnx-model-file>)
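The three steps above can be sketched as a short shell session. The commands are the ones listed above; the model file name mnist.onnx is a hypothetical example standing in for whatever ONNX model you downloaded or exported:

```shell
# 1. Get an ONNX model -- assumed here that mnist.onnx was already
#    downloaded or exported from your training framework.

# 2. Install DNN Compiler from PyPI.
pip install deepC

# 3. Compile the model into an executable for the current platform.
compile-onnx mnist.onnx
```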

Watch these steps in detail in the video below, which demonstrates how to run a popular ONNX model on an inexpensive $5 Pi Zero.

Run deep learning models on a $5 Pi Zero computer (credit: AITS)

How do I try it out?

You can start with this Google Colab notebook with no installation, or download and install this open-source compiler and framework to begin contributing.


DNN Compiler's potential is vast, since it can be made to work on any of the 30 billion microcontrollers produced each year. Beyond its applicability and market size, it has the potential to change our daily lives in ways not thought of before.

By bringing deep learning models to tiny microcontrollers, we can boost the intelligence of billions of devices that we use in our lives, without relying on expensive hardware or reliable internet connections. Imagine smart appliances that can adapt to your daily routine, intelligent industrial sensors that understand the difference between problems and regular operation, and magical toys that can help kids learn in fun and delightful ways.

Towards AI

Towards AI is the world's fastest-growing AI community for learning, programming, building, and implementing AI.


