The key to edge applications: AI infrastructure

DT42
Published in DeepThought
Aug 25, 2017

With the development of AI over the past five years, people have been able to transfer more and more tasks to machines. Today's AI can even beat humans at Go or in ImageNet competitions. But what is the mission of the next five years for the AI industry? At DT42, we believe the answer is to bring AI to more practical applications. The first step we are trying to achieve is to bring AI to edge devices. This step will help more AI applications be built, and it is going to change the way machines communicate with humans.

Figure 1: What is the next mission in AI? (from phrasee.co)

What does “bring AI to edge devices” mean? Today's AI relies on heavy computation, and therefore it lives in the cloud most of the time. The cloud is good, but it faces the challenges of high cost, limited bandwidth, and privacy concerns. In the era of IoT, cloud computing alone is not enough, and the demand for edge computing is rising [1]. The relationship between the cloud and the edge, however, should be cooperative rather than competitive: data is analyzed at the edge to provide real-time responses, and the filtered data is sent to the cloud for further analysis. Some emerging applications are already powered by edge computing, such as self-driving cars, robotics, and surveillance. Imagine deploying AI to detect poachers and protect endangered animals in the wild: how much network infrastructure would you need to install, and how much video would you need to upload to the cloud, if the AI could not run locally? Isn't it easier if the AI lives on local edge machines to detect the poachers, and only the detection signals are sent to the cloud?

Figure 2: A long way home… by RemiGardet

DT42 believes that the era of edge computing for AI has just begun, and all kinds of innovative AI edge applications will be built in the near future; some of them we cannot even imagine yet. These applications are going to change our lives thoroughly. That is why we build AI infrastructure and solutions to help our customers build AI edge applications more easily. In the following sections, we will introduce what AI infrastructure means for edge applications.

Figure 3: Edge computing applications (from CBR online)

Let’s first take a look at the components of an AI edge application. There are four major components: sensors, AI infrastructure, computing platforms, and AI core algorithms.

Take surveillance applications as an example: the sensors are cameras, and the computing platform can be an Nvidia TX1, a powerful camera, or a video gateway. AI core algorithms are mainly developed by big players with massive data, such as Google, Facebook, or Baidu. So what does AI infrastructure refer to? Before answering this, we need to take a look at the components required to run software on hardware devices.
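The four components above can be made concrete with a small structural sketch. The field names and the surveillance example values are illustrative only, not a real DT42 API:

```python
from dataclasses import dataclass

@dataclass
class EdgeApplication:
    """The four major components of an AI edge application."""
    sensor: str              # e.g. an IP camera
    computing_platform: str  # e.g. Nvidia TX1, a smart camera, a video gateway
    core_algorithm: str      # e.g. a pre-trained detection model
    infrastructure: str      # the glue that bridges the other three

# The surveillance example from the text, as an instance.
surveillance = EdgeApplication(
    sensor="IP camera",
    computing_platform="Nvidia TX1",
    core_algorithm="object-detection model",
    infrastructure="model deployment and update pipeline",
)
```

Of the four, the sensor, platform, and algorithm are each well served by existing vendors; the infrastructure field is the piece the following section focuses on.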

Figure 4: Contemporary AI system stack

That is why infrastructure is essential to edge applications. The complexity of an edge application is much higher than that of a cloud application, and AI is no exception. A deep learning model alone cannot work as an application. With good AI infrastructure, core algorithms and computing platforms can be bridged together smoothly; this is exactly what DT42 provides: AI infrastructure that converts data into AI applications and enables more possibilities on edge devices. This infrastructure helps users build and deploy AI applications to edge devices easily, and it also handles model updates after the applications are delivered to end users. With the AI infrastructure DT42 builds, machines can now have “brains” without connecting to the internet.
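One of the infrastructure duties mentioned above, handling model updates after deployment, can be sketched as a version check: the device periodically compares its deployed model version against a manifest published by the cloud and downloads a newer model when one exists. The function names and manifest fields here are assumptions for illustration, not DT42's actual update protocol:

```python
def check_for_update(fetch_manifest, current_version):
    """Return the cloud's model manifest if it is newer than the
    version deployed on this device, otherwise None.

    `fetch_manifest` is any callable returning a dict like
    {"version": 3, "model_url": "..."} (injected so the device
    logic can be exercised without a network connection).
    """
    manifest = fetch_manifest()
    if manifest["version"] > current_version:
        return manifest
    return None

def apply_update(manifest, download):
    """Download the newer model and return its local path.

    `download` is the platform-specific transfer routine.
    """
    return download(manifest["model_url"])
```

Because the check runs on the device's schedule, a fleet of edge machines can stay current without streaming any sensor data back to the cloud.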

Apart from building AI infrastructure, DT42 also builds AI algorithms for customers in the surveillance, automotive, and robotics industries. Since more than 60% of human activity relies on vision, our first goal is to combine AI models, CV algorithms, and AI infrastructure to bring better vision to machines.

[1] http://robtiffany.com/the-cloud-is-dead-long-live-the-edge/
