Most Enterprises Don’t Have a Deep Learning Strategy

Credit: https://unsplash.com/collections/326107/strategy?photo=_pc8aMbI9UQ

Deep Learning is a technology as disruptive as mobile computing or the World Wide Web that came before it. Yet most enterprises have no strategy for what to do about it. This is perplexing given that Deep Learning's most hyped slogan is "The Last Invention of Man" [KHAT]. Why is this so?

It boils down to one simple fact: enterprises don't understand Deep Learning. Worse, they can't possibly understand the current wave of Artificial Intelligence (AI) developments if they don't understand Deep Learning. Let's be perfectly clear: the recent spike in interest in AI is due primarily to Deep Learning technology. It is not due to other AI technologies such as expert systems, semantic knowledge bases, logic programming or Bayesian systems. Those AI technologies have not changed much in the last five years. The only quantum leap we've seen in that time has been Deep Learning.

AI has been talked about since the 1950s, and the term carries over half a century of extra baggage. Over the decades AI has become too nebulous, too vague to serve as a concrete driver or goal for an enterprise. It makes better sense to focus on technology that is more concrete and real. Claiming to have an AI strategy without a corresponding Deep Learning strategy is akin to admitting that there is no strategy at all!

To understand how and where it can apply AI, a business first has to understand what Deep Learning can and cannot do. An effective AI strategy begins by recognizing the benefits of Deep Learning. It is like any other tool used in a business: understand first what the tool is capable of, and plan from there. That may sound simple; unfortunately, it is not. Knowledge about the capabilities of Deep Learning is very scarce. Many Machine Learning and Data Science experts do not understand this technology. After all, it did not exist when they were learning their trade in graduate school.

Just to give you some perspective as to the extent of Deep Learning developments, look at this graph from Google showing Deep Learning use in their apps:

and this quote from the NY times article “The Great AI Awakening” [KRAU]:

The neural system, on the English-French language pair, showed an improvement over the old system of seven points.

Hughes told Schuster’s team they hadn’t had even half as strong an improvement in their own system in the last four years.

To be sure this wasn’t some fluke in the metric, they also turned to their pool of human contractors to do a side-by-side comparison. The user-perception scores, in which sample sentences were graded from zero to six, showed an average improvement of 0.4 — roughly equivalent to the aggregate gains of the old system over its entire lifetime of development.
In mid-March, Hughes sent his team an email. All projects on the old system were to be suspended immediately.

Let's recognize what happened at Google. Google has, since its very beginning, been using all kinds of "AI" or machine learning technologies. Yet its first Deep Learning implementation of translation improved the metric by seven points, more than twice the gains of the previous four years, and its user-perception gain of 0.4 roughly equaled the old system's improvement over its entire lifetime. If you are already doing AI, then Deep Learning will accelerate your improvements. If you are not doing AI, then does it make sense to do AI instead of Deep Learning? Deep Learning is real technology, and it is taking over operations at the most advanced technology companies in the world. Perhaps "real" is the wrong word; "disruptive" is more appropriate.

One reason why knowledge and talent are so scarce is the extremely rapid development of the Deep Learning field. Just to give you an idea: a month ago, the ICLR 2017 conference had around 500 papers submitted for consideration. ICLR 2017 will be held on April 24, 2017, months from now, and most people will learn about the results only then. We already saw this with the NIPS 2016 conference held last month. Almost all the material in that conference was six months old, yet lots of folks were broadcasting the newness of it all. Six months is a very long time in the Deep Learning community, yet we still have folks digging up lectures from 2014 and 2015 and claiming them to be enlightening.

Also, it is important to remember that what the big firms choose to publish is likely information that is, for them, six months to a year old. Just witness Google's revelation of the Tensor Processing Unit (TPU) ASIC. By the time it was revealed, Google had been using it for at least a year:

We’ve been running TPUs inside our data centers for more than a year, and have found them to deliver an order of magnitude better-optimized performance per watt for machine learning. This is roughly equivalent to fast-forwarding technology about seven years into the future (three generations of Moore’s Law).
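As a rough sanity check on the quote's equivalence, the arithmetic works out (assuming "an order of magnitude" means roughly 10x, and that Moore's Law doubles performance about every two years; both are assumptions, not figures from the quote):

```python
import math

# Assumption: "order of magnitude" perf/watt improvement means ~10x.
speedup = 10.0
# Assumption: Moore's Law doubles performance roughly every two years.
doubling_period_years = 2.0

# Number of doublings needed to reach a 10x gain: log2(10) ~ 3.3,
# i.e. roughly "three generations of Moore's Law".
generations = math.log2(speedup)

# ~3.3 doublings at ~2 years each: roughly "seven years into the future".
years_ahead = generations * doubling_period_years

print(round(generations, 1), round(years_ahead, 1))
```

So Google's "seven years" and "three generations" are consistent with each other as back-of-envelope figures.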

So even if we dutifully read every arXiv publication on the day it is posted, we are at best six to twelve months behind the giants of innovation!

The key to understanding all these developments is to do what Andrew Ng has advised [NG]:

In my own life, I found that whenever I wasn’t sure what to do next, I would go and learn a lot, read a lot, talk to experts. I don’t know how the human brain works but it’s almost magical: when you read enough or talk to enough experts, when you have enough inputs, new ideas start appearing. This seems to happen for a lot of people that I know.

That doesn't mean, however, that developments that are six months old are useless. Google and the other masters of the universe are themselves resource-constrained. Furthermore, these companies stick to their core competencies and rarely venture into other areas. This provides the opportunity for the rest of us. There are plenty of domains small and narrow enough that the giants will either ignore them or be unaware of their existence.

Deep Learning technology also continues to improve at a torrid pace. If you start leveraging today's Deep Learning technologies, you will be able to take advantage of the improvements in algorithms a few months later. But passively waiting for the technology to improve to the point where it exactly addresses your specific business requirements may leave your firm running the outside of the curve, possibly forever left behind. It is therefore more prudent and pragmatic to get your enterprise's feet wet with the technology now, maximizing your opportunities to leverage future improvements.

This is the main motivation behind our work on the Design Patterns for Deep Learning. A consistent conceptual model is absolutely essential to digesting the latest research developments. If you curate enough design patterns, you begin to develop an intuition for how it all fits together. This intuition allows you to get ahead of everyone else and predict where the field is heading. That is why we have focused research on the Holographic Principle as well as Modular Deep Learning. These are concepts you likely have not heard of elsewhere, but they are critical as the field progresses.

At Intuition Machine we are working on a Deep Learning Playbook for Enterprise. This playbook provides a starting point for thinking about a Deep Learning strategy. At a 35,000-foot level, we recommend the following for enterprises:

Manage Expectations

Understand the limitations of the technology: which problems it can solve and which it cannot.

Invest in Data Logistics

Invest in and develop a process that treats data as an important asset to be managed, leveraged and enhanced.

Acquire Relevant Talent

Bring in someone who understands Deep Learning and data infrastructure.

Attack the Right Problems

Evaluate your own business processes to understand which ones can benefit the most from DL. Select the processes that can have the greatest impact to the business.

Institute a DL Methodology

There are many software development methodologies; however, Deep Learning has its own unique set of capabilities and thus its own way of doing things. Learn to fuse this new kind of methodology with your existing software development life cycle.

These are general guidelines that we hope will jump-start your thinking. The key is to begin preparing your company for the inevitable transition to an AI-driven economy. It is always best to be prepared.

Have a conversation with Intuition Machine to jump start your strategy.

[KHAT] http://www.newyorker.com/magazine/2015/11/23/doomsday-invention-artificial-intelligence-nick-bostrom

[KRAU] http://www.nytimes.com/2016/12/14/magazine/the-great-ai-awakening.html?_r=0

[NG] http://www.huffingtonpost.com/2015/05/13/andrew-ng_n_7267682.html