For idealists, an endeavor is only as worthy as its best possible outcome. Perhaps because of this, artificial intelligence has long been one of the most well-established and competitive fields of computer science. Some insight into the field of AI, both as an academic discipline and as an industry, helps one appreciate the infancy of the field of blockchain oracles and foresee what is to come, because the ambitions of our own field are no smaller.
Instead of recounting the entire history of AI, I will focus on an emergent pattern that I find especially fascinating: the AI winter. Every few decades, a quantum leap in AI research sets off a cycle of hype, abundant industry funding, extreme public attention — likely orchestrated through media coverage by those same investors — and a dramatic under-delivery. Once the discrepancy between what was promised and what was delivered becomes apparent, the attention, the funding, and even the crowds of graduate students abandon the field, leaving it to a group of idealists chipping away at the next quantum leap.
An increasing number of people are claiming that we’re now entering the third AI winter. Going one step further, I would date the start of the chain of events that produced it as early as the original GAN paper. Despite not being particularly novel from a theoretical point of view, this piece of research was special in that it delivered the first of a family of generative models that produced convincing visuals. Arguably, it was these interesting pictures that allowed non-academics to appreciate the progress the field of AI had made through the early 2010s.
Although GANs were instrumental in creating the hype, it was the industry that extinguished progress with its tight grip. It did so in two ways:
- Almost all of the researchers who made significant contributions to the recent developments were poached by companies as a marketing strategy aimed at investors and potential recruits. Unfortunately, just as picking a flower kills it, their new homes stifled the researchers’ ability to do any more groundbreaking work.
- The industry, having disproportionate access to big data and computational resources, steered the entire field toward leveraging these above everything else. Having all the prominent researchers on its payroll helped, because these are the people who decide what good science is through the peer review process. Nvidia donating GPUs to practically anyone who asks can also be seen as a branch of this effort.
Despite its benign appearance on the surface, all of this led us to a point where an especially large model trained for an especially long time ended up being the best the field could offer in the year 2020.
Events such as AI winters are preceded by a collective state of mind that can only be described as hubris: glorifying what has been accomplished while underestimating how far one must go to achieve the ideals that made the endeavor worthy in the first place. What is really interesting is that no distinct event triggers the realization of an AI winter, since the perceived sense of progress was delusional all along. The collapse simply happens when it’s deemed least likely.
This is not a sad story though; quite the opposite. It’s simply the cycle of life, ever-present even at the bleeding edge of our collective intellectual output. The old dies and gives birth to the new, and that’s how progress is made.