Edge computing is one of those fields where you have the nails and are still looking for a hammer. In an earlier post, I wrote about why machine learning on the edge is critical. Pete Warden has also shared interesting insights in Why The Future of Machine Learning is Tiny. Many exciting technologies will emerge to accelerate development in this space. Today, we are going to look at how to deploy a neural network (NN) on a microcontroller (MCU) with uTensor.

Check out An End-to-End Tutorial Running Convolution Neural Network on MCU with uTensor by Dboy Liao

(📷: Azmi Semih OKAY on Unsplash)


“A view from the orbit on an artificial satellite over white clouds on the ocean” by NASA on Unsplash

Software engineering can be fun, especially when working toward a common goal with like-minded people. Ever since we started the uTensor project, an artificial intelligence framework for microcontrollers (MCUs), many have asked us: why bother with edge computing on MCUs? Aren't the cloud and application processors enough for building IoT systems? These are thoughtful questions. I will attempt to explain our motivation for the project here; hopefully, you will find it interesting too.

TL;DR: AI on MCUs enables cheaper, lower-power, and smaller edge devices. It reduces latency, conserves bandwidth, improves privacy, and enables smarter applications.

Neil Tan

Developer Evangelist at Arm, Software and Hardware Hacker, Creator of uTensor, Robotics and Physics Enthusiast
