StarCraft II. Image courtesy of Blizzard.

DeepMind’s AI learned to play StarCraft. Next, Tesla’s AI will try to learn to drive.

Scaling up machine learning with massive training datasets

Yarrow Bouchard
10 min read · Feb 5, 2019


AlphaStar, a new AI from Google’s DeepMind, has learned to play StarCraft II. In December, AlphaStar defeated MaNa – one of the top professional StarCraft players in the world – in five consecutive games. It remains an open question whether AlphaStar won through sheer mechanical superiority – the speed and precision of its clicks and keystrokes – rather than through good strategy and tactics. In January, MaNa defeated a new version of AlphaStar in a live match. That version was deprived of one unfair advantage: the ability to see the entire game map at once.

Some StarCraft fans have complained that, despite DeepMind’s assurances, AlphaStar can execute a superhuman number of clicks and keystrokes. With precise enough unit control, an AI can wield absurd, godlike power in StarCraft, winning engagements through micromanagement alone rather than strategy. MaNa has already won once one unfair advantage was removed. If AlphaStar’s mechanical abilities were truly limited to human levels, its competitive strength against humans might falter. Hopefully DeepMind will put this to the test.
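
To make “limited to human levels” a bit more concrete, here is a minimal sketch of an actions-per-minute (APM) throttle. Everything in it – the APMThrottledAgent class, the agent.act interface, the 300 APM cap – is hypothetical and far simpler than whatever DeepMind actually uses; it only illustrates the idea of capping an agent’s action rate over a sliding one-minute window.

```python
import time
from collections import deque


class APMThrottledAgent:
    """Hypothetical wrapper that caps an agent's actions per minute.

    This is an illustration only, not DeepMind's actual mechanism:
    AlphaStar's real interface constraints are more nuanced.
    """

    def __init__(self, agent, max_apm=300):
        self.agent = agent              # any object exposing act(observation)
        self.max_apm = max_apm          # assumed human-like cap
        self.action_times = deque()     # timestamps of actions in the last 60 s

    def act(self, observation):
        now = time.monotonic()
        # Drop timestamps older than one minute from the sliding window.
        while self.action_times and now - self.action_times[0] > 60.0:
            self.action_times.popleft()
        # If the cap is reached, issue a no-op instead of a real action.
        if len(self.action_times) >= self.max_apm:
            return None
        self.action_times.append(now)
        return self.agent.act(observation)
```

Even a crude cap like this changes what an agent can do in a fight, which is exactly why fans care whether AlphaStar’s effective limits were truly human-like.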
