How to Save $1,000,000 Building AI Edge Solutions

wrannaman · SugarKubes · Mar 26, 2019 · 5 min read
That IoT diagram we’ve seen a billion times.

To see the lessons learned, scroll to the bottom. This brief intro walks through a pricing example of edge vs. cloud and then gets into how to save an absurd amount of time for your startup. Also, the real amount spent is over $6M, but that’s a little harder to believe than $1M 🙈.

The fundamentals of computer vision are more or less solved. Or rather, the basics of computer vision (is there a person here?) are a commodity. Although we’re still working out what to compute where, the business value and cost savings are real.

A Cloud Example

Easy as they are to build on, the computer vision APIs from AWS, Google, et al. are still expensive. Let’s break down an example with AWS pricing for a live security camera.

1 surveillance camera at 2 FPS
= 120 frames / minute
= 7,200 frames / hour
= 172,800 frames / day
AWS Rekognition pricing is $1.00 per 1,000 images = $172.80 per camera per day :(

The security and surveillance use case is a specific one. But I wonder how many use cases require at least 2 FPS. At roughly what FPS does the cost become reasonable to run purely off of AWS Rekognition?

// Working backwards from a reasonable price per camera: $5.00 / month
= ~$0.167 / day
= $0.167 / ($1.00 per 1,000 images) = ~167 images / day
= ~7 images / hour
= ~1 image every 9 minutes

I wouldn’t exactly call one frame every nine minutes high fidelity. You can’t get anywhere near the data resolution you’d need for retail analytics at that rate. At current pricing, it’s unusable for this kind of application.
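If you want to plug in your own numbers, here is the same math as a small sketch. The per-1,000-image rate below assumes the first Rekognition Image pricing tier; swap in whatever your bill actually says.

# Back-of-the-envelope cloud inference cost, per camera.
# Assumes a flat per-image rate; volume discounts and free tiers are ignored.
PRICE_PER_1000_IMAGES = 1.00   # USD, assumed first Rekognition Image pricing tier

def daily_cost(fps: float) -> float:
    """Cost per camera per day at a given analyzed frame rate."""
    frames_per_day = fps * 60 * 60 * 24
    return frames_per_day / 1000 * PRICE_PER_1000_IMAGES

def affordable_fps(monthly_budget: float) -> float:
    """Max analyzed frame rate that fits a per-camera monthly budget."""
    frames_per_day = (monthly_budget / 30) / PRICE_PER_1000_IMAGES * 1000
    return frames_per_day / (60 * 60 * 24)

print(daily_cost(2.0))        # ~$172.80 per camera per day at 2 FPS
print(affordable_fps(5.00))   # ~0.002 FPS, i.e. roughly one frame every 9 minutes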

An Edge Example

Edge inference makes this particular use case more palatable. As an example, take AWS Greengrass: if you assume local inference is free, you’re paying $0.16 per device per month for management (amazing) and nothing for cloud inference (also amazing). Furthermore, if you upload video directly to Wasabi instead of S3, you don’t take a data-transfer hit, and you can start to put together a vertical solution that has real margin.

Simple Edge Application Architecture
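As a sketch of the “upload straight to cheap S3-compatible storage” piece of that architecture: Wasabi speaks the S3 API, so a stock boto3 client works if you point it at Wasabi’s endpoint. The endpoint shown is Wasabi’s default; the credentials, bucket, and file path are placeholders.

import boto3

# Wasabi is S3-compatible, so the regular boto3 S3 client works; you just point it
# at Wasabi's endpoint instead of AWS. Endpoint, credentials, bucket, and paths are examples.
s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.wasabisys.com",
    aws_access_key_id="YOUR_WASABI_ACCESS_KEY",
    aws_secret_access_key="YOUR_WASABI_SECRET_KEY",
)

# Ship a recorded clip off the edge box. Inference already happened locally, so the
# cloud bill is storage plus Greengrass management, not per-frame API calls.
s3.upload_file("/var/clips/cam01-20190326.mp4", "my-edge-clips", "cam01/20190326.mp4")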

A Note On Developing Embedded Systems

As a small aside, there are some additional challenges developing for the edge. Developers used to working in the cloud often forget that programming for an embedded device is slightly different than developing for the cloud.

This may or may not have happened to me…🤦🏼‍♂️

Your cleanup process failed, the disk filled up, and you silenced the email alerts because you get 2,000 of them a day. The machine just rebooted and hopefully it will come back up okay. Meanwhile, it’s a premier customer with a demo in the morning in front of all the stakeholders, your boss is yelling at you, customers are hurling obscenities at you over Slack, and you’re having a panic attack at 3 AM on a Tuesday… 🙊

Aside from this and 100 other gotchas of embedded programming, some things about the edge are actually easier, but that’s another topic entirely.
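If nothing else, put a dumb disk watchdog on every box so a failed cleanup job can’t brick the unit at 3 AM. A minimal sketch, with a made-up clip directory and threshold; run it from cron or a systemd timer, separate from the main application.

import shutil
from pathlib import Path

CLIP_DIR = Path("/var/clips")   # wherever your recordings land (example path)
MAX_DISK_USAGE = 0.85           # start pruning at 85% full (arbitrary threshold)

def prune_oldest_clips() -> None:
    """Delete the oldest recordings until disk usage drops below the threshold."""
    clips = sorted(CLIP_DIR.glob("*.mp4"), key=lambda p: p.stat().st_mtime)
    for clip in clips:
        total, used, _free = shutil.disk_usage(CLIP_DIR)
        if used / total < MAX_DISK_USAGE:
            break
        clip.unlink()  # oldest first; the cloud copy is the system of record

if __name__ == "__main__":
    prune_oldest_clips()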

The Deciding Factor

So the question comes down to what kind of hardware you need to run the application and, really, what core value prop your company delivers. How many cameras can you jam onto one machine? How many sensor inputs can you handle at peak?
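One rough way to answer that is to budget your measured inference throughput against the frames coming in. All the numbers below are placeholders; benchmark your own model on your own device.

# Back-of-the-envelope capacity check for a single edge box.
INFERENCE_MS_PER_FRAME = 45   # measured single-frame latency on the target device (example)
FPS_PER_CAMERA = 2            # analyzed frames per second, per camera (example)

inferences_per_second = 1000 / INFERENCE_MS_PER_FRAME
max_cameras = int(inferences_per_second // FPS_PER_CAMERA)

print(f"~{inferences_per_second:.1f} inferences/sec -> roughly {max_cameras} cameras at {FPS_PER_CAMERA} FPS")
# Leave headroom for video decode, encode, and uploads; peak load is what kills you.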

We’ve all heard of the Raspberry Pi, so I’ll skip it. Below are some interesting new options on the horizon.

Google Coral (https://coral.withgoogle.com/) — Coming soon! Available as a USB accelerator and as an embedded dev board (SoC).

Intel OpenVINO / Movidius (https://software.intel.com/en-us/openvino-toolkit) — Hardware-accelerated inference on select CPUs (like the newer i-series) or via a USB stick (the Movidius Neural Compute Stick 2 is now available!).

Nvidia Jetson Nano (https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/jetson-nano/) — A Jetson variant that’s smaller in both form factor and CUDA cores.

Moving up the chain a bit, here are some beefier x86 machines worth considering, though they’re going to cost you $500 and up all in.

Lenovo Tiny — These can come with a Movidius chip on a board (not sure if this is released yet; I got a dev preview, so it’s probably landing sometime in 2019), so you can get a solid x86 CPU with dual Movidius over PCIe.

Intel NUC — Coupled with OpenVINO, grabbing an i-series NUC is a great way to do inference at the edge (see the sketch below).
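For the OpenVINO/Movidius route, you don’t even have to touch the raw Inference Engine API to get started: OpenCV’s DNN module (in a build with Inference Engine support, such as the one bundled with OpenVINO) can target a Movidius stick directly. A minimal sketch, assuming you already have a detector exported to OpenVINO IR format; the file names and input size are placeholders.

import cv2

# Load an OpenVINO IR model (placeholder file names) and run it on a Movidius stick
# through OpenCV's DNN module with the Inference Engine backend.
net = cv2.dnn.readNet("person-detection.xml", "person-detection.bin")
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_INFERENCE_ENGINE)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_MYRIAD)   # DNN_TARGET_CPU works for testing without the stick

frame = cv2.imread("frame.jpg")
blob = cv2.dnn.blobFromImage(frame, size=(544, 320))  # input size depends on the model
net.setInput(blob)
detections = net.forward()
print(detections.shape)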

TL;DR 🤷‍♀️

Having spent about 4 years in this space doing edge computing before the wave, here’s how to save your startup (sorry, this is startup-specific) a lot of migraines and several million dollars in dev time:

  • Screw air-gapped solutions — Lean on AWS or other cloud-based IoT management software. Don’t build this yourself unless you want it to be your business; it’s a pain in the ass, a snake pit, 🙅🏻‍♀️. If the customer can’t call out and hit the cloud, ditch them.
  • Don’t build algorithms — We’re based in the US. We’ve lost more devs to GAFA than I’d like to admit, mostly because of visa issues, so screw it: just use the open-source models and/or learn how to use SageMaker or AutoML to train your own. You don’t need a Ph.D., nor an MS, nor a BS. Can you code? Cool, then you can figure this out too.
  • Raspberry Pis kind of suck — Tempting as their price point is, I’ve burned out about 25 Raspberry Pis in production. When you’re loading the hell out of these machines in adverse temperatures (running 10 cameras on one Pi doing real-time inference in a warehouse that’s over 100 degrees Fahrenheit), they die pretty quickly. This is not a production-grade device for uncontrolled customer environments. Either get an industrial case or get an automotive-grade board with a ruggedized case.
  • Think about reporting early — Metabase is amazing for a lot of use cases and it’s carried us pretty far, but your customers might need more. The earlier you think about this, the better. Also, reporting is endless; you don’t want to be in the business of making bespoke reports.
  • Think about search early — We use Elasticsearch / ELK and it’s fine for us. The Elasticsearch API calls are cumbersome, though; GraphQL might be a better fit for you.

Liked the article? Sign up for our newsletter. This article was brought to you by SugarKubes. SugarKubes is a container marketplace. Want to start running AI at the edge? Need some sweet machine learning models that work out of the box? Check us out at https://sugarkubes.io.
