A.I. Articles of the Week, Dec. 2017 #4

Shan Tang · Published in BuzzRobot · Dec 25, 2017

The Most 2017 Story of 2017

It’s a Christmas tale for our time: Cyber nerds using high-tech software to buy a slew of baby-monkey robots and holding them ransom for thousands of dollars.

This year, the world woke up to the society-shifting power of artificial intelligence.

And, naturally, 2018 will bring more AI research. Autonomous cars will get more precise and facial recognition will get spookier, but the questions the technology raises will remain the same.

Thoughts on David Donoho’s “Fifty Years of Data Science”

This post was originally published as part of a collection of discussion pieces on David Donoho’s paper. The original paper and collection of discussions can be found at the JCGS web site.

Two Myths About Automation

While many people believe that technological progress and job destruction are accelerating dramatically, there is no evidence of either trend. In reality, total factor productivity, the best summary measure of the pace of technical change, has been stagnating since 2005 in the US and across the advanced-country world.
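(For context, total factor productivity here is the standard growth-accounting measure, the Solow residual: the part of output growth not explained by growth in capital and labor inputs,

$$\Delta \ln A \;=\; \Delta \ln Y \;-\; \alpha\,\Delta \ln K \;-\; (1-\alpha)\,\Delta \ln L,$$

where $Y$ is output, $K$ capital, $L$ labor, and $\alpha$ capital's share of income.)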

Google Has a New Plan for China (and It’s Not About Search)

More than seven years after exiting China, Google is taking the boldest steps yet to come back. And it’s not with a search engine.

Building A.I. That Can Build A.I.

Google and others, fighting for a small pool of researchers, are looking for automated ways to deal with a shortage of artificial intelligence experts.

188 examples of artificial intelligence in action

Because animating the poo emoji is only the beginning.

How many images do you need to train a neural network?

Today I got an email with a question I’ve heard many times: “How many images do I need to train my classifier?” In the early days I would reply with the most technically correct, but also useless, answer: “it depends.” Over the last couple of years, though, I’ve realized that having even a very approximate rule of thumb is useful, so here it is for posterity: you need 1,000 representative images for each class.
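As a back-of-the-envelope illustration of that rule of thumb (a heuristic, not a guarantee; the function and numbers below are just for illustration):

```python
# Rough dataset sizing from the ~1,000-images-per-class rule of thumb above.
# Purely illustrative: real requirements vary with class difficulty, image
# diversity, and whether you fine-tune a pretrained network instead of
# training from scratch.
def estimate_training_set_size(num_classes, images_per_class=1000):
    return num_classes * images_per_class

print(estimate_training_set_size(2))   # binary classifier: ~2,000 images
print(estimate_training_set_size(10))  # 10-way classifier: ~10,000 images
```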

Superhuman AI for heads-up no-limit poker: Libratus beats top professionals

Abstract: No-limit Texas hold’em is the most popular form of poker. Despite AI successes in perfect-information games, the private information and massive game tree have made no-limit poker difficult to tackle. We present Libratus, an AI that, in a 120,000-hand competition, defeated four top human specialist professionals in heads-up no-limit Texas hold’em, the leading benchmark and long-standing challenge problem in imperfect-information game solving. Our game-theoretic approach features application-independent techniques: an algorithm for computing a blueprint for the overall strategy, an algorithm that fleshes out the details of the strategy for subgames that are reached during play, and a self-improver algorithm that fixes potential weaknesses that opponents have identified in the blueprint strategy.
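The blueprint-style equilibrium computation in this line of work is built on the counterfactual regret minimization (CFR) family of algorithms. As a loose, minimal sketch of that family (not Libratus’s actual system), here is vanilla CFR on Kuhn poker, the standard toy imperfect-information game:

```python
import random
from collections import defaultdict

# Vanilla counterfactual regret minimization (CFR) on Kuhn poker.
# NOT Libratus's algorithm -- just a small, runnable example of the
# equilibrium-finding family its blueprint strategy belongs to.
# Kuhn poker: 3 cards (1 < 2 < 3), each player antes 1 chip and is dealt
# one card; actions are "p" (pass/check/fold) and "b" (bet/call 1 chip).

ACTIONS = ["p", "b"]

class Node:
    """Regret and strategy accumulators for one information set."""
    def __init__(self):
        self.regret_sum = [0.0, 0.0]
        self.strategy_sum = [0.0, 0.0]

    def strategy(self, realization_weight):
        s = [max(r, 0.0) for r in self.regret_sum]  # regret matching
        total = sum(s)
        s = [x / total for x in s] if total > 0 else [0.5, 0.5]
        for i in range(2):
            self.strategy_sum[i] += realization_weight * s[i]
        return s

    def average_strategy(self):
        total = sum(self.strategy_sum)
        return ([x / total for x in self.strategy_sum]
                if total > 0 else [0.5, 0.5])

nodes = defaultdict(Node)

def cfr(cards, history, p0, p1):
    """Expected utility for the player to act at `history`."""
    player = len(history) % 2
    # Terminal payoffs, from the perspective of the player to act.
    if len(history) >= 2:
        higher = cards[player] > cards[1 - player]
        if history[-1] == "p":
            if history == "pp":               # check-check: showdown for antes
                return 1 if higher else -1
            return 1                          # opponent folded to a bet
        if history[-2:] == "bb":              # bet called: showdown for 2 chips
            return 2 if higher else -2
    info_set = str(cards[player]) + history
    node = nodes[info_set]
    strategy = node.strategy(p0 if player == 0 else p1)
    util, node_util = [0.0, 0.0], 0.0
    for i, a in enumerate(ACTIONS):
        if player == 0:
            util[i] = -cfr(cards, history + a, p0 * strategy[i], p1)
        else:
            util[i] = -cfr(cards, history + a, p0, p1 * strategy[i])
        node_util += strategy[i] * util[i]
    opp_reach = p1 if player == 0 else p0
    for i in range(2):
        node.regret_sum[i] += opp_reach * (util[i] - node_util)
    return node_util

if __name__ == "__main__":
    random.seed(0)
    cards, value, iterations = [1, 2, 3], 0.0, 50_000
    for _ in range(iterations):
        random.shuffle(cards)
        value += cfr(cards, "", 1.0, 1.0)
    print(f"Average game value for first player: {value / iterations:.3f}")  # ~ -1/18
    for info_set in sorted(nodes):
        print(info_set, [round(p, 3) for p in nodes[info_set].average_strategy()])
```

The average strategies converge toward a Nash equilibrium; Libratus scales this idea up with abstraction, nested subgame solving, and self-improvement, as the abstract describes.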

Applied Machine Learning at Facebook: A Datacenter Infrastructure Perspective

Abstract: Machine learning sits at the core of many essential products and services at Facebook. This paper describes the hardware and software infrastructure that supports machine learning at global scale. Facebook’s machine learning workloads are extremely diverse: services require many different types of models in practice. This diversity has implications at all layers in the system stack. In addition, a sizable fraction of all data stored at Facebook flows through machine learning pipelines, presenting significant challenges in delivering data to high-performance distributed training flows. Computational requirements are also intense, leveraging both GPU and CPU platforms for training and abundant CPU capacity for real-time inference. Addressing these and other emerging challenges continues to require diverse efforts that span machine learning algorithms, software, and hardware design.

A List of Chip/IP for Deep Learning (keep updating)

Machine learning, and deep learning in particular, is driving the evolution of artificial intelligence (AI). In the beginning, deep learning was primarily a software play. Starting in 2016, the need for more efficient hardware acceleration of AI/ML/DL was recognized in academia and industry. This year, we saw more and more players jump into the race, including the world’s top semiconductor companies, a number of startups, and even tech giants such as Google. I believe it is interesting to look at them together, so I built this list of AI/ML/DL ICs and IPs on GitHub and keep it updated. If you have any suggestions or new information, please let me know.

Weekly Digest Nov. 2017 #1

Weekly Digest Nov. 2017 #2

Weekly Digest Nov. 2017 #3

Weekly Digest Nov. 2017 #4

Weekly Digest Dec. 2017 #1

Weekly Digest Dec. 2017 #2

Weekly Digest Dec. 2017 #3


Shan Tang
Since 2000, I have worked as an engineer, architect, or manager on different types of IC projects. Since mid-2016, I have been working on hardware for deep learning.