On AIaaS, Embedded AI and the AI business models of the future

Nikolay Dimolarov
Mar 21, 2019 · 5 min read

I have been meaning to start writing again for the longest time (think years). And today seemed like the right day to do so; if there is such a thing as right. My first topic is one I have spent a lot of time with in the past 6 months: how to make money with Artificial Intelligence. When I say AI I don’t mean AGI (Artificial General Intelligence — as in Asimov’s I, Robot stuff), I mean plain ol’ Deep Learning (around since the 80s — thanks, Geoffrey Hinton) and Machine Learning in general.

Currently, the money seems to revolve around two things: the Big Five’s API business models, which could be lumped together as AIaaS, and large consulting companies selling you 6-to-8-figure PowerPoints telling you how you are about to miss the AI train. Let us focus on the Big Five — we all know how PowerPoint works.

First, a look at Azure and how Microsoft could potentially make a million dollars with chatbots. The tech of choice within their Azure Cognitive Services offering is called LUIS (Language Understanding). Should not be too hard for a company with that many Fortune 500 customers, right?

Here is the official pricing for LUIS for the region “West US”:

  1. Free: $0 for 10,000 transactions with 5 TPS*
  2. Standard: $1.50 / 1,000 transactions (= API calls) with 50 TPS*

*TPS = transactions per second

A typical chatbot conversation — one of the more prominent “AI” use cases nowadays — will have around 20 messages if you have a pretty decent dialog, meaning 20 transactions per user. An intent here is just a mapping of, e.g., “What is the weather in Munich?” to a fixed intent/action in our source code, which checks the weather via a weather API provider and returns a response. That means you can have 500 chatbot conversations for free — actually way more, considering a realistic drop rate in the dialog. Let us assume 700, and let us also assume you have a small website with the need for a chatbot: you basically get away for free with 700 conversations per month. The situation is similar for pre-trained computer vision models for object detection.
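To make the intent/action mapping concrete, here is a minimal sketch. The names (`get_weather`, `INTENT_HANDLERS`, `handle`) are hypothetical, not part of the LUIS API — in a real bot, the intent string and entities would come back from the LUIS endpoint, and only the dispatch below would live in your code:

```python
# Minimal sketch: dispatching a recognized intent to a fixed action.
# In production, `intent` and `entities` would be parsed from the
# JSON response of a LUIS transaction (i.e., one billed API call).

def get_weather(city: str) -> str:
    # Placeholder for a call to a weather API provider.
    return f"It is sunny in {city}."

INTENT_HANDLERS = {
    "get_weather": get_weather,
}

def handle(intent: str, entities: dict) -> str:
    handler = INTENT_HANDLERS.get(intent)
    if handler is None:
        return "Sorry, I did not understand that."
    return handler(entities.get("city", "Munich"))

print(handle("get_weather", {"city": "Munich"}))  # It is sunny in Munich.
```

The point being: the “AI” part is only the intent classification; everything after that is plain application code you would write either way.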

Why would Microsoft do this? You know how much you have been paying for MS Office since the 90s, so what is happening here? Hard to say. Somewhere between “nobody will pay for this anyway” and “let us hook as many people on the API train as possible and eventually convert them to paying customers.” In the meantime, a simple calculation:

$1,000,000 / $1.50 × 1,000 T = 666,666,666 T (ignoring all significant figures)

This is the number of transactions you would need to make a million dollars with this cognitive service. The irony of “666” is not lost on me :)

Going back to the 20 T per conversation:

666,666,666 T / 20 T = 33,333,333 chatbot conversations (ignoring all significant figures)

Ok. So, basically, we need 33 million good chatbot conversations to make a million dollars. Now imagine being an AI startup trying to make your name with your newly minted neural network that can recognize intents 10x better than Microsoft’s. If you want to compete for your first million in revenue, you need to find money to handle 666 million API requests (plus some DevOps engineers and, ironically, some budget for AWS or Azure), and potentially spend hundreds of thousands of dollars on marketing to pry away big customers willing to pay for Azure’s paid tier.
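The back-of-envelope math above can be reproduced in a few lines. Computing in cents with integer division matches the truncated figures from the article:

```python
# How many LUIS Standard-tier transactions (and 20-message chatbot
# conversations) it takes to bill $1,000,000 at $1.50 per 1,000 calls.

PRICE_CENTS_PER_1000_T = 150   # $1.50 Standard tier, in cents
TARGET_REVENUE_USD = 1_000_000
MSGS_PER_CONVERSATION = 20     # transactions per typical dialog

# Work in cents and floor-divide, i.e. "ignoring all significant figures".
transactions = TARGET_REVENUE_USD * 100 * 1000 // PRICE_CENTS_PER_1000_T
conversations = transactions // MSGS_PER_CONVERSATION

print(f"{transactions:,} transactions")    # 666,666,666 transactions
print(f"{conversations:,} conversations")  # 33,333,333 conversations
```

Flip the calculation around and it doubles as a sanity check for a startup’s own pricing: at any per-call price, divide a revenue target by it and see whether the resulting volume is remotely reachable.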

And the picture becomes clear — it is next to impossible to compete with the Big Five on AI if we are talking AIaaS. And part of your revenue would always flow back to those same companies as hosting fees.

Enter “Embedded AI” — or, to broaden the use case, on-premise AI in general. This is the only alternative I can think of for competing with the Big Five when it comes to AI — assuming we are not talking hyper-niche markets where you train a neural network with expert knowledge in a domain like health care.

The pitch to customers is that they can run most neural networks on embedded devices, or at least on-premise in general, instead of in the cloud — where you run into compliance issues, GDPR issues, IT security issues, etc. But how do you make money if you cannot meter the users’ usage? You are promising them privacy first and on-premise, after all. Good question, and there is no simple answer.

Multiple ways of going about this:

  • License the tech based on current usage, sign a contract based on those numbers, and revisit the contract each month/quarter/year. Easy enough to manage, but you are not very flexible regarding pricing, and billing would be a hassle for both your startup and your customers.
  • Let your neural network phone home with API calls to monitor usage. Cool, but you also implied that the NN can be used offline. Remember? It is on-premise, after all. How do you collect your money from people actually using your product on the North Pole or in some shipping yard in rural China? Tough.
  • Sell the NN for a fixed fee and lock it with a license. Not sure how to approach this from a technical standpoint, but probably doable. But what if that customer could have brought you 10 million dollars in recurring revenue, if you had only gone for Microsoft’s business model? What would your investors say? They do love recurring revenue. Trust me on this one.

Wrapping up with a small summary:

  1. AIaaS is a numbers game only the Big Five can win: your running costs would probably be too high to compete with their pricing, while they pay for their own infrastructure running on metal (physical servers) at cost. The exception is small niche use cases with expert domain knowledge — and even there you have to fight for survival, since it will be difficult to reach volume and your use case has to be special enough that it cannot be covered by their offerings. Not easy.
  2. In Embedded AI you have a technological advantage, since you have zero issues with GDPR, IT security, etc., and your business model is not bound to recurring revenue and cloud computing bills — big clients would potentially love this. But how do you bill them? Would they really move to your product if they can just pay for lawyers to write good ToS and stay with Microsoft’s easy-to-use APIs? Only time will tell.

Sadly, I have no easy answers. How would you make money with AI?

I just used Microsoft as an example because I have been using their Azure services extensively with large corporates. Feel free to insert IBM or AWS instead — the message will be the same.


We are an R&D company that is specialized in the newest technologies.

Written by Nikolay Dimolarov

I solve problems with software. I am a PM for enterprise software products interested in mobile & web development, DL and NLP. http://www.dimolarov.com/
