The US used to be the world leader in this key area of artificial intelligence research. Now it’s China
Rosamond Hutt, Formative Content
China has overtaken the United States to become the world leader in deep learning research, a branch of artificial intelligence (AI) inspired by the human brain.
Deep learning algorithms are modeled on biological neural networks and enable machines to learn from data and produce human-like responses. They’re rapidly becoming part of everyday life. Think: smartphone assistants answering your questions, or Amazon recommending products based on your search and purchasing history.
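To make the idea concrete: deep learning stacks many simple artificial "neurons" into layers. Below is a minimal, purely illustrative sketch of a single such neuron learning a toy pattern (logical OR) by gradient descent. It is not any system mentioned in this article, just the basic building block scaled up enormously in real deep learning systems.

```python
import math

# A single artificial neuron: a weighted sum of inputs passed through a
# sigmoid "activation", loosely mimicking how a biological neuron fires.
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy training data: examples of the logical OR pattern.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

w = [0.0, 0.0]  # one weight per input, learned from the data
b = 0.0         # bias term
lr = 0.5        # learning rate: how big each corrective nudge is

# Training loop: repeatedly nudge the weights to shrink the prediction
# error (gradient descent, the workhorse of deep learning).
for _ in range(5000):
    for inputs, target in data:
        pred = sigmoid(w[0] * inputs[0] + w[1] * inputs[1] + b)
        err = pred - target
        grad = err * pred * (1 - pred)  # gradient w.r.t. the weighted sum
        w[0] -= lr * grad * inputs[0]
        w[1] -= lr * grad * inputs[1]
        b -= lr * grad

def predict(a, c):
    """Return the neuron's 0/1 answer for inputs a and c."""
    return round(sigmoid(w[0] * a + w[1] * c + b))
```

After training, the neuron has learned the OR pattern from examples alone, with no explicit rule programmed in. Deep networks chain thousands of such units across many layers, which is what lets them recognize speech, images, and shopping preferences.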
Image: The National Artificial Intelligence Research and Development Strategic Plan
The White House has released two reports that aim to help prepare the US for the growing role of artificial intelligence in society.
The National Artificial Intelligence Research and Development Strategic Plan lays out the strategy for AI funding and development in the US. It shows that since mid-2013 China has published more journal articles on deep learning than the US, and has had more of its studies cited by other researchers.
In 2015, for example, Chinese researchers published around 350 articles, compared to around 260 in the US.
The report says the US will need to step up investment: “Current levels of R&D spending are half to one-quarter of the level of R&D investment that would produce the optimal level of economic growth.”
Image: CB Insights
In the future, deep learning algorithms could transform everything from work to warfare.
In an article for the World Economic Forum, Marc Benioff, chairman and CEO of Salesforce, explains that the convergence of big data, machine learning and increased computing power will soon make artificial intelligence “ubiquitous”.
“AI follows Albert Einstein’s dictum that genius renders simplicity from complexity,” he writes. “So, as the world itself becomes more complex, AI will become the defining technology of the twenty-first century, just as the microprocessor was in the twentieth century.”
Originally published at www.weforum.org.