GPU vs CPU Computing: What to choose?

Olena · Published in altumea · 4 min read · Feb 8, 2018

CPUs and GPUs have a lot in common. They are both silicon-based microprocessors. At the same time, they are substantially different, and they are deployed for different roles.

What are CPUs and GPUs?

A CPU (central processing unit) is often called the “brain” or the “heart” of a computer. It is required to run the majority of engineering and office software. However, there is a multitude of tasks that can overwhelm a computer’s central processor. That is when using a GPU becomes essential for computing.

A GPU (graphics processing unit) is a specialized type of microprocessor, primarily designed for quick image rendering. GPUs appeared as a response to graphically intense applications that put a burden on the CPU and degraded computer performance. They became a way to offload those tasks from CPUs, but modern graphics processors are powerful enough to perform rapid mathematical calculations for many other purposes apart from rendering.

What is the difference?

CPUs and GPUs process tasks in different ways, and the two are often compared to brain and brawn. A CPU (the brain) can work on a wide variety of different calculations, while a GPU (the brawn) is best at focusing all of its computing ability on one specific task. That is because a CPU consists of a few cores (up to 24) optimized for sequential serial processing: it is designed to maximize the performance of a single task within a job, but the range of tasks it can handle is wide. A GPU, on the other hand, uses thousands of smaller, more efficient cores in a massively parallel architecture aimed at handling many operations at the same time.
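To make the contrast concrete, here is a toy Python sketch (illustrative only; NumPy itself runs on the CPU, but its vectorized form mirrors how a GPU would hand one element to each of thousands of threads):

```python
import numpy as np

# Serial, CPU-style: a single core walks the elements one at a time.
def scale_serial(values, factor):
    out = []
    for v in values:              # each step waits for the previous one
        out.append(v * factor)
    return out

# Data-parallel, GPU-style: the same multiply applies to every element
# independently, so many cores could each take one element at once.
def scale_parallel(values, factor):
    return np.asarray(values) * factor   # one operation over the whole array

print(scale_serial([1, 2, 3], 10))    # [10, 20, 30]
print(scale_parallel([1, 2, 3], 10))  # [10 20 30]
```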

Modern GPUs provide superior processing power, memory bandwidth, and efficiency over their CPU counterparts, and can be 50–100 times faster in tasks that require many parallel processes, such as machine learning and big data analysis (source: blogs.nvidia.com).

What problems are GPUs suited to address?

GPU computing is defined as the use of a GPU together with a CPU to accelerate scientific, analytics, engineering, consumer, and enterprise applications.
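A minimal sketch of that division of labor, assuming the optional CuPy library (a GPU array library with a NumPy-like interface; the function name and sizes below are placeholders). If no GPU is present, the same code simply falls back to the CPU:

```python
import numpy as np

try:
    import cupy as cp      # GPU array library with a NumPy-like interface
    xp = cp                # use the GPU if one is available
except ImportError:
    xp = np                # otherwise everything stays on the CPU

def heavy_kernel(size: int) -> float:
    """Compute-intensive portion: offloaded to the GPU when CuPy is present."""
    x = xp.random.rand(size, size)
    y = x @ x.T            # large matrix multiply, the GPU-friendly hotspot
    return float(y.sum())  # the result returns to the CPU as a plain float

# The rest of the program is ordinary CPU-side Python.
print(heavy_kernel(1024))
```

The pattern is the whole idea in miniature: the CPU stays in charge of the program, and the GPU is called in only for the heavy lifting.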

For many years, GPUs have powered the display of images and motion on computer screens, but they are technically capable of far more. Graphics processors are brought into play when massive calculations are needed on a single task.

Such tasks include:

  • Games

A graphics processing unit is essential for fast, graphics-intensive rendering of the gaming world. Rendering special effects and sophisticated 3D graphics in real time requires serious computing power, and the demands of modern games are far too heavy for a CPU-based graphics solution. Games have even gone a step further with virtual reality, which is believable precisely because GPUs can quickly render and maintain realistic images with proper lighting and shading.
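As a taste of the per-pixel math involved, here is a toy diffuse (Lambertian) shading sketch; the 1080×1920 resolution and the random surface normals are placeholders, and a real renderer would run this on the GPU for every pixel, every frame:

```python
import numpy as np

light_dir = np.array([0.0, 0.0, 1.0])       # light pointing at the screen
normals = np.random.randn(1080, 1920, 3)    # one surface normal per pixel
normals /= np.linalg.norm(normals, axis=-1, keepdims=True)

# Brightness = dot(normal, light direction), clamped at zero: roughly
# two million independent dot products, one per pixel.
brightness = np.clip(normals @ light_dir, 0.0, 1.0)
print(brightness.shape)                      # (1080, 1920)
```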

  • 3D Visualization

GPUs drive viewport performance in 3D visualization applications such as computer-aided design (CAD). Software that lets you visualize objects in three dimensions relies on the GPU to redraw those models in real time as you rotate or move them.
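Here is the core of that per-frame work in a toy Python sketch (the million-vertex mesh is a random placeholder): applying one rotation matrix to every vertex of a model, which is exactly the kind of independent per-vertex arithmetic a GPU parallelizes as you orbit a CAD model.

```python
import numpy as np

theta = np.radians(30)                      # rotate the view by 30 degrees
rot_y = np.array([[ np.cos(theta), 0, np.sin(theta)],
                  [ 0,             1, 0            ],
                  [-np.sin(theta), 0, np.cos(theta)]])

vertices = np.random.rand(1_000_000, 3)     # placeholder mesh: a million vertices
rotated = vertices @ rot_y.T                # every vertex transformed in one step
print(rotated.shape)                        # (1000000, 3)
```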

  • Image Processing

GPUs can accurately process millions of images to find differences and similarities. This ability is extensively used in fields such as border control, security, and medical X-ray processing. For example, in 2010 the US military linked together more than 1,700 Sony PlayStation 3™ systems to process high-resolution satellite imagery more quickly.
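The kernel of such a comparison can be sketched in a few lines of Python (the function name and the random placeholder “images” are illustrative, not a real pipeline). Because each pixel is compared independently, a GPU can run this across huge batches of images at once:

```python
import numpy as np

def image_difference(img_a: np.ndarray, img_b: np.ndarray) -> float:
    """Mean absolute per-pixel difference; 0.0 means identical images."""
    diff = np.abs(img_a.astype(np.float32) - img_b.astype(np.float32))
    return float(diff.mean())

a = np.random.randint(0, 256, (512, 512), dtype=np.uint8)  # placeholder image
b = a.copy()
b[100:110, 100:110] = 0          # introduce a small change
print(image_difference(a, b))    # small but nonzero difference score
```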

  • Big Data

With thousands of computational cores and 10–100x application throughput compared to CPUs alone, graphics processors are the tool of choice for scientists and industry when processing big data. GPUs are used to depict data as interactive visualizations and to integrate it with other datasets in order to explore its volume and velocity. For example, gene mapping can now be accelerated by processing data and analyzing covariances to understand the relationships between different combinations of genes.
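A toy sketch of that covariance step (the sample and marker counts are placeholders, not a real genomics pipeline). Every pairwise covariance is independent work, which is why this kind of analysis maps so well onto thousands of GPU cores:

```python
import numpy as np

samples = np.random.rand(500, 2_000)   # 500 individuals x 2,000 gene markers

cov = np.cov(samples, rowvar=False)    # 2,000 x 2,000 covariance matrix:
print(cov.shape)                       # every pair of markers compared

# On a GPU, the same computation can run via the drop-in CuPy library:
#   import cupy as cp
#   cov = cp.cov(cp.asarray(samples), rowvar=False)
```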

  • Deep Machine Learning

Machine learning has been around for some time now, but powerful and efficient GPU computing has raised it to a new level. Deep learning is the use of sophisticated neural networks to create systems that can perform feature detection from massive amounts of unlabeled training data. GPUs can process tons of training data and train neural networks in areas like image and video analytics, speech recognition and natural language processing, self-driving cars, computer vision and much more.
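As an illustration, here is a minimal sketch of how one widely used framework, PyTorch, moves training onto a GPU; the tiny linear “network” and the random batch are placeholders, and the key step is the .to(device) call:

```python
import torch
import torch.nn.functional as F

device = "cuda" if torch.cuda.is_available() else "cpu"

model = torch.nn.Linear(784, 10).to(device)    # tiny stand-in for a real network
data = torch.randn(64, 784, device=device)     # batch of 64 fake samples
labels = torch.randint(0, 10, (64,), device=device)

loss = F.cross_entropy(model(data), labels)
loss.backward()                                 # gradients computed on the GPU
print(loss.item())
```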

GPUs are not replacements for CPU architecture. Rather, they are powerful accelerators for existing infrastructure: GPU-accelerated computing offloads the compute-intensive portions of an application to the GPU, while the remainder of the code still runs on the CPU. From a user’s perspective, applications simply run much faster. While general-purpose computing is still the CPU’s domain, GPUs have become the hardware backbone of many computationally intensive applications.
