The Top 10 Emerging Technologies of 2017

AI and Quantum Computing Reviewed

MIT IDE
MIT Initiative on the Digital Economy
6 min read · Dec 4, 2017


By Irving Wladawsky-Berger

Earlier this year, Scientific American, in collaboration with the World Economic Forum, published a special report on The Top Ten Emerging Technologies of 2017. These technologies — selected by a global panel of experts — “are expected to become increasingly commonplace in the next few years,” and are “attracting increased funding or showing other signs of being ready to move to the next level.”

Credit: World Economic Forum

Here are the 10 technologies comprising the 2017 list:

1. Noninvasive biopsies for cancer
2. Harvesting clean water from air
3. Deep learning for visual tasks
4. Liquid fuels from sunshine
5. The Human Cell Atlas
6. Precision farming
7. Affordable catalysts for green vehicles
8. Genomic vaccines
9. Sustainable design of communities
10. Quantum computing

The full report includes a one-page description of each of these technologies. Let me briefly discuss the two technologies in the IT category: AI and quantum computing.

Artificial Intelligence

After many years of promise and hype, AI has finally been making great progress, and it is now being applied to activities that not long ago were viewed as the exclusive domain of humans. Not surprisingly, a related AI topic, Open AI Ecosystem: From artificial to contextual intelligence, was also selected for the 2016 Top Ten Emerging Technologies list, given that “over the past several years, several pieces of emerging technology have linked together in ways that make it easier to build far more powerful, human-like digital assistants…[which] could unlock higher productivity and better health and happiness for millions of people within the next few years.”

This year’s AI topic deals with computer vision and deep learning technologies. “For most of the past 30 years, computer vision technologies have struggled to help humans with visual tasks, even those as mundane as accurately recognizing faces in photographs. Recently, though, breakthroughs in deep learning, an emerging field of artificial intelligence, have finally enabled computers to interpret many kinds of images as successfully as, or better than, people do.”

Deep learning is part of a broad family of machine learning methods that have played a major role in AI’s recent achievements. Machine learning gives computers the ability to learn by ingesting and analyzing large amounts of data instead of being explicitly programmed. It has enabled the construction of AI algorithms that can be trained on large numbers of sample inputs and then applied to difficult problems like computer vision, natural language processing and machine translation.
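To make that concrete, here is a minimal sketch of that workflow in Python, using scikit-learn’s small built-in handwritten-digit dataset (my illustration, not from the report):

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Learning from data instead of explicit rules: no one writes down
# what an "8" looks like; the model infers it from labeled samples.
digits = load_digits()  # 1,797 labeled 8x8 grayscale digit images
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=2000)
model.fit(X_train, y_train)  # "training" = fitting weights to the samples

print(f"accuracy on unseen digits: {model.score(X_test, y_test):.2f}")
```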

Machine learning grew out of decades-old research on neural networks, a method for having machines learn from data that’s loosely modeled on the way a biological brain, composed of large clusters of highly connected neurons, learns to solve problems. Based on each person’s life experiences, the synaptic connections among pairs of neurons get stronger or weaker.

Similarly, each artificial neural unit in a network is connected to many other such units, and the links can be strengthened or weakened based on the data used to train the system, as opposed to being set by fixed, explicitly programmed rules. As new data is ingested, the system rewires itself based on the new patterns it finds.
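To see what strengthening or weakening links means in practice, here is a toy single artificial neuron trained with the classic perceptron update rule (my own illustrative example, learning the logical AND of two inputs):

```python
import numpy as np

def predict(w, b, x):
    # A single artificial neuron: weighted sum of inputs, then a threshold.
    return 1 if np.dot(w, x) + b > 0 else 0

# Toy training data: the logical AND of two binary inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(20):                       # repeated passes over the data
    for x_i, y_i in zip(X, y):
        error = y_i - predict(w, b, x_i)  # 0 when the prediction is right
        w += lr * error * x_i             # strengthen or weaken each link
        b += lr * error

print([predict(w, b, x) for x in X])      # -> [0, 0, 0, 1]
```

The weights start at zero and are nudged up or down only when the neuron’s prediction is wrong; no rule for AND is ever programmed in.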

Deep learning is based on multilayered neural networks that look for patterns of patterns, with each successive layer looking for patterns in the output of the previous layer. Such multilayered networks are extraordinarily complicated, requiring huge amounts of data and very powerful computers to handle their training, something that’s now finally possible given the tremendous recent progress in graphics processing units and parallel processing.
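As a rough sketch of what layers looking for patterns in the previous layer means, here is a forward pass through a small three-layer network in plain NumPy (the weights are random here; training would tune them):

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, w, b):
    # One layer: a learned linear map followed by a nonlinearity (ReLU).
    return np.maximum(0.0, x @ w + b)

x = rng.normal(size=(1, 784))                        # a flattened 28x28 image
w1, b1 = 0.01 * rng.normal(size=(784, 128)), np.zeros(128)
w2, b2 = 0.01 * rng.normal(size=(128, 64)), np.zeros(64)
w3, b3 = 0.01 * rng.normal(size=(64, 10)), np.zeros(10)

h1 = layer(x, w1, b1)     # low-level patterns (edges, blobs)
h2 = layer(h1, w2, b2)    # patterns of those patterns
logits = h2 @ w3 + b3     # scores for 10 output classes
print(logits.shape)       # -> (1, 10)
```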

“Computer-vision systems powered by deep learning are being developed for a range of applications,” notes the report. “The technology is making self-driving cars safer by enhancing the ability to recognize pedestrians. Insurers are starting to apply deep-learning tools to assess damage to cars. In the security camera industry, [deep learning networks] are making it possible to understand crowd behavior, which will make public places and airports safer. In agriculture, deep-learning applications can be used to predict crop yields, monitor water levels and help detect crop diseases before they spread. Deep learning for visual tasks is making some of its broadest inroads in medicine, where it can speed experts’ interpretation of scans and pathology slides and provide critical information in places that lack professionals trained to read the images — be it for screening, diagnosis, or monitoring of disease progression or response to therapy.”

Quantum Computing

Computer vision, deep learning, and related AI capabilities clearly belong in the 2017 list of Top Ten Emerging Technologies. Their impact can already be meaningfully felt, and there’s little doubt that these technologies will become increasingly commonplace over the next few years.

But frankly, I feel that it may be premature to include quantum computing in the same Top Ten list. Quantum computing is definitely a very interesting and important technology, but I think that it’s still in the research stage, rather than being ready to move to the next level.

A number of experts in the field feel otherwise. Recently, the House Science Committee held a hearing on American leadership in quantum computing. Six quantum information science experts told the Committee that “the emerging technology’s progress is at an inflection point worldwide and more federal funding is needed to train experts and advance real-world applications.”

A few weeks ago, the Wall Street Journal published an interview with Microsoft’s co-founder and first CEO Bill Gates and its current CEO Satya Nadella that added fuel to my doubts about the maturity of quantum computing. The interview covered a variety of topics, including the value of empathy in business, the perils of automation, immigration policy and even cricket. But perhaps the most intriguing question they were asked was “Can you explain in one sentence to my 72-year-old mother: What is quantum computing?”

“I don’t think so. I wish I could,” replied Nadella. He then added that we need breakthroughs like quantum computing to keep up with the exponential growth in computing power needed to solve ever more complex problems, including climate, food production and drug discovery.

“That’s the one part of Microsoft where they put up slides that I truly do not understand,” said Bill Gates. “I know a lot of physics and a lot of math. But the one place where they put up slides and it is hieroglyphics, it’s quantum.”

Almost everyone would agree that Bill Gates and Satya Nadella are among the savviest technical executives in the IT industry. So the fact that neither was able to explain what quantum computing is about is, for me, a strong indicator that, for now, the technology is still in the research stage. In my experience, when technologies are ready to transition from research labs to early adopters, they’re generally better understood by experts like Nadella and Gates than is the case with quantum computing today. The situation might well be quite different in a few short years.

“Quantum computers tackle problems by harnessing the power of quantum mechanics,” explained the Top Ten report. “Rather than considering each possible solution one at a time, as a classical machine would, they behave in ways that cannot be explained with classical analogies. They start out in a quantum superposition of all possible solutions, and then they use entanglement and quantum interference to home in on the correct answer — processes that we do not observe in our everyday lives. The promise they offer, however, comes at the cost of them being difficult to build. A popular design requires superconducting materials (kept 100 times colder than outer space), exquisite control over delicate quantum states and shielding for the processor to keep out even a single stray ray of light…”
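To unpack “superposition” and “interference” a little, here is a toy state-vector simulation of a single qubit (my illustration; it computes the quantum math on a classical machine, so there is no speedup involved):

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                    # the |0> basis state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # the Hadamard gate

# Superposition: one gate puts the qubit in an equal mix of |0> and |1>.
superposed = H @ ket0
print(np.abs(superposed) ** 2)   # -> [0.5 0.5], a 50/50 measurement

# Interference: applying the gate again makes the two computational
# "paths" cancel for |1> and reinforce for |0>.
back = H @ superposed
print(np.abs(back) ** 2)         # -> [1. 0.], deterministically |0> again
```

A real quantum computer does this with n entangled qubits, whose joint state is a vector of 2^n amplitudes; that exponential state space is the promise, and keeping it coherent is the difficulty the report describes.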

“There are still many obstacles. Coherence times must improve, quantum error rates must decrease, and eventually, we must mitigate or correct the errors that do occur. Researchers will continue to drive innovations in both the hardware and software. Investigators disagree, however, over which criteria should determine when quantum computing has achieved technological maturity. Some have proposed a standard defined by the ability to perform a scientific measurement so obscure that it is not easily explained to a general audience. I and others disagree, arguing that quantum computing will not have emerged as a technology until it can solve problems that have commercial, intellectual and societal importance. The good news is, that day is finally within our sights.”

I truly hope so.

This blog first appeared Nov. 13, 2017, here.
