The Deep History of Deep Learning… Part V
Identity Crisis, The AI Spring, Circular References and Predicting The Next Big Thing
At last, we have come to the final part of our history of Deep Learning. The route has been circuitous — what do you expect from a magazine titled Circa Navigate? The history has revolved around a much broader topic than just Deep Learning. It has incorporated Artificial Intelligence, Machine Learning, Analytics, and more. No worries. It is all about to get… much worse.
The second AI Winter wasn’t really so grim. Rather than a winter of our discontent, it was more a winter of our general disinterest. Within a few short years, folks were already predicting a new AI Spring. Only, nearly three decades later, we are still waiting. Looking back, rather than an AI Spring, we got an identity crisis.
Expert Systems had arrived in the 70’s. They were often built on LISP machines (see our last article). In the early 90’s, these systems proved so beneficial that they began to be integrated into everything. Only, no longer standalone, they required a new name. The Rules Engine was born.
Rules Engines played very well with the burgeoning software industry. They acquired their own acronym (BRE), their own stage in the software lifecycle (Business Rules Development), and were quickly forgotten as having any relation to artificial intelligence.
Neural Networks became all the rage in the 90’s, too. They had their own acronym (ANN). They were a re-emergence of the connectionist systems that had fallen from favor a few decades earlier. They had been modeled to replicate the human mind, but had been hampered by limited computing power in earlier decades. With the increase in computing power, the possibilities were unlimited… or so incredibly limited that people would still be chasing the dream three decades later.
Complexity Science had its day, as well. It was at full steam by the mid-90’s. Organizations like the Santa Fe Institute and the journal Complexity rode the coattails of Chaos Theory and ANNs to propel this new science forward.
DARPA chose 1996 to propel Network Science forward in the public consciousness. It rode the coattails of the burgeoning World Wide Web. Connections were all the rage. Six Degrees of Kevin Bacon (1994) punctuated the popularity and recognition of our ever more connected universe.
Next on the scene was Deep Blue. Where Samuel had focused on checkers, Deep Blue would take on chess. It had been a decade in the making, and after dropping a first match in 1996, it took down Garry Kasparov in 1997. It was technically a rematch; the two had originally met in 1989, when Deep Blue was still Deep Thought. Deep Thought had not fared well, an outcome likely predicted by its namesake. But now, Deep Blue was the talk of the town and, much later, the target of many claims of the less than honorable sort.
Whether Deep Blue ‘cheated’ or Samuel’s player beat only a ‘self-proclaimed’ checkers expert, it really doesn’t matter. What mattered more was the arrival of Watson. Samuel’s player arrived in the late 50’s, Deep Blue (as Deep Thought) in the late 80’s, and Watson, right on schedule, arrived in 2011.
This cycle was repeating everywhere. Iteration after iteration. Solutions rose and fell. Each built on the last. Each disappearing just long enough to seem like a brand new thing, with brand new hype.
In the late 90’s, Distributed Computing arrived. Rather than calculate Bacon numbers, folks became enthralled with looking for aliens. SETI@home is still looking.
Distributed Computing is like a poor man’s neural network, at least in a physical sense. The new ease of leveraging this sort of processing power gave new life to the dusty disciplines of machine learning and neural networks. As might be expected, that cycle has recurred again recently with the advent of Cloud Computing. Cloud Computing’s hype peaked in 2011.
The language of machine learning continued to evolve as well. LISP was edged out by C++, which gave way to Python, which was in turn augmented by TensorFlow. TensorFlow was a product of the Google Brain team (formed in 2011), who may or may not have bet against Watson in that Jeopardy competition. TensorFlow — ‘the language everyone should learn’ — arrived in 2015. And so had Deep Learning…
Yes, the term had been coined decades earlier. The techniques were present years before. But Deep Learning’s time in the sun only came with the intersection of new theory, new techniques, new languages, new platforms, and (of course) new branding. Its hype is growing, and if the past rhymes, it will continue to do so well into the next decade.
One Final Thought
Is Deep Learning the next big thing? My answer is a definitive (yet qualified) YES! Now let me lay out that qualification. Every revolutionary technology spends decades being over-hyped, over-invested, and over-nuanced. The wheel, fire, the sail, the steam engine, the rail, the radio, the TV, the airplane, the web, the [insert over-hyped thing here] — each “will change the world as we know it”. In fact, they all did.
That said, none of these lived up to their hype. I am guessing on a few, but hype always outpaces even the greatest technological breakthroughs. It is the nature of hype, but it is also the nature of application. Nothing is as simple as it first appears; this history series is a fine example. There are just too many dependencies. Too many circular references.
Also, few of these feel so amazing any longer, even the web. Remember the New Economy… nonsense? Or maybe it wasn’t nonsense? No, a decade ago it surely was. Though now it is starting to re-emerge (where have we seen that before?), only now it is integrated. The web is a channel, as was the plane, the rail, and soon the SpaceX rocket. So were the radio, the TV, email, and letters. Fire, electricity, microwaves, and the rest are tools and resources. But trust me, everyone needed a microwave, because soon the oven was going to be a thing of the past or we were all going to die of cancer… actually both.
So I do believe that Deep Learning will be looked on as one of these breakthrough inventions in the centuries to come. It will be a tool or resource that you will find everywhere you turn. It will make fortunes and break plenty more. It will also be debated: was it deep learning, machine learning, neural networks, or artificial intelligence? That debate, and why I believe this is such a sure bet, will have to wait for a later article…
Thanks for reading!