The biggest challenge of AI is not people or technology

CD Athuraliya
4 min read · May 2, 2018


Yes, that’s correct. Well, I think so, and let me explain.

I think the biggest challenge of AI right now is making it an industry, or an engineering field. All the other challenges have been around for a while, if you really think about it. There can be slowdowns and booms in fundamental research, and talent can be in short supply, but we still make progress. Yet we have not defined AI as an industry. And we need to do it right away!

But why? We can all see how AI is changing the world, from the photos we take to how we move around. It is becoming more ubiquitous every day. From a technical point of view, this means the systems we use in our day-to-day lives have much more AI underneath.

AI as a field has existed for more than six decades, but for most of its lifetime as a science, not as an industry. With the emergence of deep learning and its recent successes, big players (Google et al.) saw the potential and started using AI in their products and services. Now it’s quite a race among them. But do we have a field called “AI engineering” yet? You may say yes, because there are a few MOOCs and many job openings. Of course that’s good, but it’s not enough to define a new field.

If you think about it, software engineering is decades old and has matured as an engineering field over that time. It has been taught at schools, hundreds of books have been written, and frameworks and paradigms have been introduced. AI needs to reach that maturity soon. But still, why do I think this is the biggest challenge? Let me explain a little more.

Software engineering, or software development as a field, has evolved under the assumption that everything is deterministic. True, your code stops working randomly and infrastructure fails for no reason (we think!), but it was never intended to behave that way. We write code and build infrastructure to work the same way, all the time. AI, especially the more commonly used machine learning and deep learning, is mostly probabilistic. We don’t build things to work the same way 100% of the time. That’s also the beauty of AI. This might not be new to you if you know at least the basics of machine learning, and none of it should bother you much as long as you’re doing a research implementation or an ML hobby project. But think about building production systems that involve the complete software development life cycle with that probabilistic behavior in mind.
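
To make that concrete, here is a minimal sketch of how testing changes. The `classify` function below is a hypothetical stand-in for a real model (it is not from any actual project), and the 0.85 threshold is an illustrative assumption: a deterministic function gets an exact assertion, while an ML component can only be checked in aggregate, against a threshold.

```python
import random

# A deterministic function: the same input always gives the same output,
# so an exact assertion makes sense and either always passes or always fails.
def parse_amount(text: str) -> float:
    return float(text.replace(",", ""))

assert parse_amount("1,250.50") == 1250.50

# A hypothetical ML-style component (a stand-in for a real model) that is
# only correct with some probability. Asserting on a single prediction
# tells you almost nothing.
def classify(image_id: int) -> str:
    return "cat" if random.random() < 0.9 else "dog"

# Instead, we assert an aggregate property over many samples,
# with an explicit acceptance threshold.
labels = ["cat"] * 1000
predictions = [classify(i) for i in range(1000)]
accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)
assert accuracy >= 0.85, f"accuracy {accuracy:.2f} is below the acceptance threshold"
```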

We started thinking about all of this as soon as we started working with clients. From requirement analysis to system testing, everything changes when you’re working with probabilities underneath. Instead of (ideally) 100% uptime, we’re now talking about 70% of the time in one state and 30% in another. Don’t forget that these are probabilities; 60/40 is also acceptable. In reality, it gets even more complicated than this.
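
As a rough illustration of what such an acceptance criterion could look like, here is a hedged sketch. The `route` function, the 70% target, and the tolerance are all hypothetical numbers chosen for illustration, not figures from a real client project; the only point is that the check is statistical, not exact.

```python
import random

# Hypothetical acceptance criterion: the system should handle requests
# automatically ("auto") at least ~70% of the time and fall back to
# human review ("review") otherwise. All numbers here are illustrative.
TARGET_AUTO_RATE = 0.70
TOLERANCE = 0.05

def route(request_id: int) -> str:
    # stand-in for confidence-based routing by a real model
    return "auto" if random.random() < 0.72 else "review"

outcomes = [route(i) for i in range(10_000)]
auto_rate = outcomes.count("auto") / len(outcomes)

# The check passes if the observed rate stays within tolerance of the
# target, not if every single request takes the same path.
assert auto_rate >= TARGET_AUTO_RATE - TOLERANCE, (
    f"auto-handling rate {auto_rate:.2%} fell below the agreed "
    f"{TARGET_AUTO_RATE:.0%} (with {TOLERANCE:.0%} tolerance) acceptance level"
)
```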

“All this is fine, but why today? We have some time, right?” you may ask. Ideally, it should have been yesterday! As I said at the beginning, we see AI products everywhere, but everyone is building them in their own way. We don’t have common frameworks (I’m not talking about tools here) or engineering paradigms for AI product development yet. We could be missing compatibility, interoperability, security, scalability, resilience … you name it. I know what you’re thinking: I must be out of my mind to suggest they don’t maintain these at Google or Tesla. Yes, they do, but it’s very likely those are their own standards and frameworks. It is highly unlikely that these different companies follow the same paradigms of AI product development, even at a high level, the way we have been doing in software engineering. And that is why we need to define a new engineering field for AI. We need to talk more, we need to write textbooks, we need to develop courses for schools.

I have some good news too! Things are not as bad as you might think. I can remember a few related sessions at SOCML 2017, maybe because the sessions were suggested by attendees. Unfortunately, I had to skip “ML in Production” for something more interesting (!) but (in my defense) I was at “Software Frameworks for Machine Learning”. Hopefully, we’ll see more discussions and events around AI as an engineering field in the coming months and years. I think AI was incubated in the 2010s and will reach maturity in the 2020s. But there’s a lot to be done!

I’ll be writing more about our AI experiences at ConscientAI in upcoming posts. I would love to hear back if you think this is useful. Do share your thoughts/feedback below.


CD Athuraliya

Co-founder @ConscientAI, Research Fellow @LIRNEasia. Thinking about and working in AI.