MACHINE LEARNING

Should You Use a GPU for Your Machine Learning Project?

Learn the main differences between using CPU and GPU for your machine learning project, and understand which to choose

Chris Verdence
2 min readAug 20, 2020

Many machine learning projects involve training an algorithm on massive amounts of data and hence require a lot of computing power. This computing power is largely determined by the number and quality of the processing units, such as the central processing unit (CPU) or the graphics processing unit (GPU). Performance also depends heavily on the number of cores in those processing units. A core is a unit that can perform computing operations, and the more cores a CPU or GPU has, the more operations it can run in parallel. Although both the CPU and the GPU deliver computing power, certain differences between them make each suitable for different situations.

Most computers come with one or more CPUs, which perform arithmetic, logic, control, and input/output operations. These operations are essential for many of the day-to-day tasks people perform on their computers. A typical CPU has four to eight cores and can therefore run a few operations in parallel.
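To see how many cores your own machine offers, a one-line check with Python's standard library is enough (note that this reports logical cores, which on CPUs with simultaneous multithreading can be double the physical core count):

```python
# Query the number of logical CPU cores the operating system exposes.
import os

cores = os.cpu_count()
print(f"Logical CPU cores available: {cores}")
```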

Photo by Olivier Collet on Unsplash

The GPU, on the other hand, is well suited to manipulating computer graphics and image processing. A GPU is typically built from hundreds of cores, which lets it run many operations in parallel, a trait that makes it interesting for algorithms that process data in parallel. This is also why GPUs have become so popular in machine learning. Many machine learning projects can be sped up by processing data in parallel, and with their large number of cores, GPUs are useful in such projects.
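To see why this kind of workload parallelizes so well, consider a batched linear layer, one of the core operations in machine learning. Every output element is an independent dot product, so hundreds of cores can compute them simultaneously. The sketch below uses NumPy on the CPU purely for illustration (the shapes are made up); frameworks like PyTorch dispatch the identical operation to GPU cores:

```python
# A batched linear transform: Y = X @ W.T, where every entry of Y is an
# independent dot product that could be assigned to its own core.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))   # weights: 4 outputs, 3 inputs
X = rng.standard_normal((5, 3))   # batch of 5 input vectors

# One matrix multiply = 4 * 5 = 20 independent dot products
Y = X @ W.T

# Each entry can be computed in isolation; nothing depends on its
# neighbours, which is exactly what makes the operation parallelizable.
for i in range(5):
    for j in range(4):
        assert np.isclose(Y[i, j], np.dot(X[i], W[j]))
```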

In addition to the number of cores, there are other differences between CPUs and GPUs. Compared to CPUs, GPUs typically devote proportionally more transistors to arithmetic logic units (ALUs) and fewer to caches and flow control. This also helps GPUs train your machine learning algorithm faster.

Generally, the advantage of a GPU over a CPU is more significant for projects with large amounts of computations.

  1. If your machine learning algorithm is relatively small-scale, a CPU should be sufficient.
  2. If your machine learning algorithm involves massive amounts of calculation and hundreds of thousands of parameters, then you should use a GPU.
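In practice, a common pattern is to decide at runtime and fall back to the CPU when no GPU is present. The sketch below uses PyTorch's real `torch.cuda.is_available()` check; the try/except keeps it runnable even on machines without PyTorch installed:

```python
# Pick the processing unit at runtime: prefer the GPU when one is
# available, otherwise fall back to the CPU.
def pick_device() -> str:
    try:
        import torch
        return "cuda" if torch.cuda.is_available() else "cpu"
    except ImportError:
        # PyTorch not installed; CPU is the only option
        return "cpu"

device = pick_device()
print(f"Training will run on: {device}")
```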

