Intelligence of Things (IoT 2.0)

Sunil Rao
The Baryon Group
May 4, 2016

As more and more devices come online each day, our reliance on the cloud grows with them. The Internet of Things has ushered in a wave of fascinating devices addressing millions of use cases. Most of these devices lean on their connection to the cloud to communicate with one another, relay critical information back to host systems, or retrieve content to serve requests.

The Amazon Echo is a great example of a multi-purpose connected device. This little gadget lets you make requests by voice command, then attempts to satisfy them through its connection to the Internet, specifically Amazon's servers. Amazon even provides a marketplace where developers can hook their own endpoints up to voice commands, letting users reach virtually any available service just by asking.

Siri is yet another example of a popular connected assistant. From retrieving the weather in the morning to setting alarms at night, Siri does a great job of handling small tasks we would otherwise have to navigate our phones to accomplish.

However, these two well-known voice-powered assistants share a fundamental drawback: they do not work without an Internet connection. Of course, if a request genuinely requires the Internet, that is a dealbreaker. But what about the many requests that don't? How can we make these devices less dependent on the cloud and able to make decisions locally?
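To make that concrete, here is a minimal sketch of the hybrid pattern, written in plain Python with made-up helper names rather than any vendor's actual API: intents an on-device model can cover are handled locally, and only requests that genuinely need the Internet fall back to the cloud.

```python
import socket

# Intents we assume an embedded model can handle without the cloud.
LOCAL_INTENTS = {"set_alarm", "unlock_phone", "toggle_lights"}

def cloud_reachable(host="example.com", port=443, timeout=1.0):
    """Cheap connectivity probe; a real assistant would track this continuously."""
    try:
        socket.create_connection((host, port), timeout=timeout).close()
        return True
    except OSError:
        return False

def run_on_device_model(intent, payload):
    # Stand-in for an embedded model (wake word, face match, alarm logic).
    return f"Handled '{intent}' locally."

def send_to_cloud(intent, payload):
    # Stand-in for the usual HTTPS round trip to the assistant backend.
    return f"Sent '{intent}' to the cloud."

def handle_request(intent, payload=None):
    if intent in LOCAL_INTENTS:
        return run_on_device_model(intent, payload)  # no network needed
    if cloud_reachable():
        return send_to_cloud(intent, payload)
    return "Sorry, I need an internet connection for that."
```

The interesting question is how far the local list can grow once real neural networks fit on the device itself, which is exactly what the next two announcements are about.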

Think of a future where Siri could use the front-facing camera on your phone to recognize you on video while you ask her to unlock it. That isn't possible today, because Siri needs a connection to Apple's servers to run the machine learning models that process the natural language you speak to her.

I read an interesting article this week about a company called Movidius. They are working on a chip that lets you run trained deep neural networks at low power on any machine, using a dedicated VPU, or vision processing unit.

Movidius is pleased to introduce the new Fathom machine learning software framework. Fathom converts trained offline neural networks into embedded neural networks running on the ultra-low power Myriad 2 VPU. By targeting Myriad 2, Fathom makes it easy to profile, tune and optimize your standard TensorFlow or Caffe neural network. Fathom allows your network to run in embedded environments such as smart cameras, drones, virtual reality headsets and robots. Fathom takes Deep Neural Networks to where they have never gone before, at high speeds and ultra-low power at the network edge.
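Based on that description, the workflow looks roughly like this: train the network offline with TensorFlow or Caffe, convert it once for the Myriad 2, then run the compiled graph entirely on the device. The sketch below is mine, not Movidius's: the conversion command and the `vpu_runtime` module are hypothetical stand-ins for whatever the Fathom toolchain actually ships.

```python
# Offline (workstation): a trained TensorFlow/Caffe network is exported once,
# then converted for the Myriad 2 VPU, e.g. with something like:
#   $ fathom-convert --model frozen_graph.pb --output face_id.blob   # hypothetical CLI
#
# On-device (drone, smart camera, headset): load the compiled blob and run it locally.
import numpy as np
import vpu_runtime  # hypothetical embedded inference library, not the real Fathom API

graph = vpu_runtime.load("face_id.blob")  # the compiled network lives on the device

def identify(frame: np.ndarray) -> int:
    """One forward pass on the VPU; no cloud round trip involved."""
    scores = graph.infer(frame.astype(np.float16))  # low-power chips favor fp16
    return int(np.argmax(scores))
```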

For those of you familiar with Google’s Project Tango tablets, you’ll recognize Movidius since they’re the ones that provide the spatial recognition chip.

Qualcomm has also announced a new deep learning SDK for its machine learning platform “Zeroth”. Here is an excerpt from Qualcomm:

- Accelerated runtime for on-device execution of convolutional and recurrent neural networks on the Snapdragon 820 cores (Qualcomm Kryo CPU, Qualcomm Adreno GPU, Qualcomm Hexagon DSP)
- Support for common deep learning model frameworks, including Caffe and CudaConvNet
- A lightweight, flexible platform designed to utilize Snapdragon heterogeneous cores to deliver optimal performance and power consumption
- Supports companies in a broad range of industries, including healthcare, automotive, security, and imaging, to run their own proprietary trained neural network models on portable devices
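The on-device side of Qualcomm's pitch is the same idea with heterogeneous cores: the runtime picks the most power-efficient engine that can execute the network. Again, a sketch under my own assumptions; the `zeroth_sdk` module and its calls are invented names, not the real SDK surface.

```python
import numpy as np
import zeroth_sdk  # hypothetical on-device runtime; the real SDK's names will differ

# Prefer the most power-efficient Snapdragon 820 core, falling back toward the CPU.
for target in ("hexagon_dsp", "adreno_gpu", "kryo_cpu"):
    try:
        net = zeroth_sdk.load_network("trained_model.bin", runtime=target)
        break
    except zeroth_sdk.RuntimeUnavailable:
        continue
else:
    raise RuntimeError("no supported core available on this device")

def classify(image: np.ndarray):
    """Single forward pass, executed entirely on the handset."""
    return net.execute({"data": image})
```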

With these two leading the way on running trained neural nets locally, our connected devices will no longer have to call out to the cloud to make intelligent decisions. Autonomous drones and intelligent dash cams are only the beginning.

I recently binge-watched the cinematic cutscenes from the Halo game series on YouTube. After reading about the developments in this space, I couldn't help but draw parallels to how Master Chief plugs Cortana into any device, or into the back of his helmet, and she transfers with the chip. I don't imagine this is too far off from our future. The evolution of this technology coupled with augmented reality should make for very interesting times ahead.
