Human Learning Journal: Thanks, gamers

Valentin Kindschi
impactIA
Nov 23, 2020

This weekly journal reports my ups and downs as a machine learning and robotics intern at the impactIA Foundation. Try to have fun reading it, but know that I have little fun writing it. You have been warned.

Thanks to my work at impactIA, I was recently able to afford a new graphics card for my gaming computer. I can now play any recent game at the best graphics settings, taunt my friends, and, more importantly, train new AIs very quickly! But how does that work? How did the gaming industry contribute to the rise of machine learning?

Machine learning, and artificial intelligence in general, has been growing exponentially over the past years. The hype around it draws more students into AI courses every year. Moreover, with the democratization of free online courses, anyone can learn to build their first neural network in a few weeks. Even in industry, digitalization and AI-based tools are becoming unavoidable and will be the standard in a few years.

In parallel, closely following Moore's law, the graphics card industry has released more and more powerful products every year, especially in 2020 with NVIDIA's latest RTX 3000 series and AMD's Radeon 6000 series. Upcoming games are not only smoother at higher resolutions, but also use novel rendering techniques such as ray tracing to simulate light rays, which makes them a lot more realistic. All of this is nice, I can play pretty games, but what makes graphics cards so important for AI?

New NVIDIA GPU capabilities (source: NVIDIA)

In AI, and particularly when working with deep neural networks, the longest task is the training phase. During training, one needs to multiply the inputs by every neuron's weights, and do so for the millions or billions of neurons in the network. Then the weights are updated, and the whole process is repeated again and again until the weights barely move anymore and the network performs reasonably well. These are not complex computations: they are mostly multiplications and additions. A lot of them.

Simplified single neuron diagram
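
To make this more concrete, here is a tiny sketch in Python (using NumPy, with made-up data and a single dense layer, not anything from impactIA) of what training by gradient descent looks like: nothing but multiplications and additions, repeated until the weights settle.

```python
import numpy as np

# Illustrative sketch only: one dense layer trained by gradient descent,
# with made-up data, to show that training boils down to multiplications
# and additions repeated over and over.
rng = np.random.default_rng(0)
x = rng.normal(size=(64, 100))         # 64 samples, 100 input features
y = rng.normal(size=(64, 1))           # made-up targets
w = rng.normal(size=(100, 1)) * 0.01   # the layer's weights
b = np.zeros(1)
lr = 0.01                              # learning rate

for step in range(1000):
    pred = x @ w + b                   # forward pass: multiply inputs by weights, add bias
    error = pred - y
    grad_w = x.T @ error / len(x)      # gradients: again just multiplications and additions
    grad_b = error.mean()
    w -= lr * grad_w                   # update the weights...
    b -= lr * grad_b                   # ...and repeat until they barely move anymore
```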

Historically, the processor (or CPU, for Central Processing Unit) has been considered the heart of a computer. It handles all the computations that happen “behind the scenes”, such as opening files, running programs, and reading from or writing to memory. A CPU can perform complex operations very quickly and efficiently. However, it is not very good at doing computations in parallel and will address them almost one at a time.

With the growing size and resolution of monitors, it became more difficult for the CPU to handle all the work. In fact, to render an image on a screen, a processor needs to regularly compute new values for each pixel; therefore, the bigger the screen, the more pixels, and the more work for the processor.
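
A quick back-of-the-envelope calculation (assuming a typical 60 frames per second) shows how fast the pixel count, and therefore the workload, grows with resolution:

```python
# Back-of-the-envelope sketch: pixel updates per second at 60 frames per second.
resolutions = {"HD (720p)": (1280, 720),
               "Full HD (1080p)": (1920, 1080),
               "4K (2160p)": (3840, 2160)}
for name, (width, height) in resolutions.items():
    pixels = width * height
    per_second = pixels * 60           # every pixel recomputed 60 times per second
    print(f"{name}: {pixels:,} pixels -> {per_second:,} pixel updates per second")
```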

This is where graphics cards come in. A graphics card (or GPU, for Graphics Processing Unit) is a processor whose primary job is to render images. Its goal is to update as many pixels as possible, as quickly as possible, and the best way to do that is to render a lot of pixels simultaneously. Therefore, a GPU performs a lot of “simple” computations at the same time, instead of computing one pixel at a time.
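
Here is a small illustrative sketch of that “same simple operation on every pixel” pattern. NumPy still runs it on the CPU, but expressing the operation over the whole frame at once is exactly the kind of workload a GPU executes in parallel (the gamma correction and the frame size are just made-up examples):

```python
import numpy as np

# Illustrative sketch: the same simple operation (a gamma correction, chosen
# arbitrarily) applied to every pixel of a small made-up frame.
frame = np.random.default_rng(1).random((270, 480, 3))   # fake low-resolution frame

# One pixel at a time: the sequential, CPU-style way.
corrected_loop = np.empty_like(frame)
for i in range(frame.shape[0]):
    for j in range(frame.shape[1]):
        corrected_loop[i, j] = frame[i, j] ** (1 / 2.2)

# The identical operation expressed over all pixels at once: this is the kind
# of massively parallel, "simple math everywhere" workload a GPU is built for.
corrected = frame ** (1 / 2.2)

assert np.allclose(corrected, corrected_loop)
```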

To summarize, a CPU is most of the time more powerful than a GPU, but computes very few operations in parallel, whereas a GPU is very good at performing a lot of computations at the same time. This is why GPUs are a lot more efficient than CPUs for training neural networks: they are able to update a lot of weights simultaneously instead of processing one neuron at a time.
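
As a rough illustration, and assuming PyTorch and a CUDA-capable GPU are available, one can time the same layer-sized matrix multiplication on the CPU and on the GPU. The exact numbers depend entirely on the hardware, but the GPU version is typically much faster:

```python
import time
import torch   # assumes PyTorch is installed; the GPU path assumes a CUDA-capable card

# Rough sketch: time the same layer-sized matrix multiplication on CPU and GPU.
batch, n_in, n_out = 4096, 4096, 4096
x = torch.randn(batch, n_in)
w = torch.randn(n_in, n_out)

start = time.perf_counter()
y_cpu = x @ w                          # many multiply-adds, a few at a time
cpu_time = time.perf_counter() - start

if torch.cuda.is_available():
    x_gpu, w_gpu = x.cuda(), w.cuda()
    _ = x_gpu @ w_gpu                  # warm-up so we don't time CUDA start-up costs
    torch.cuda.synchronize()
    start = time.perf_counter()
    y_gpu = x_gpu @ w_gpu              # the same multiply-adds, thousands at a time
    torch.cuda.synchronize()           # wait for the GPU to actually finish
    gpu_time = time.perf_counter() - start
    print(f"CPU: {cpu_time:.3f} s, GPU: {gpu_time:.3f} s")
else:
    print(f"CPU: {cpu_time:.3f} s (no CUDA GPU found)")
```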

This is how, thanks to the gamers who always ask for better and smoother graphics and push GPU companies to make better and more efficient products, we now have the power required to train amazing AIs in seconds instead of years. It also led NVIDIA to become one of the biggest companies in AI hardware, as they develop more and more products designed especially for computer vision and optimized for machine learning workflows.
