A brief and simple introduction to GPGPU

Edith Puclla
Published in Katsuhi Code
3 min read · Jan 29, 2019

Hi! I'm Edith. Welcome to my blog. Here's my first entry, about a topic that has me curious and eager to learn more.

Since I started working at BlazingSQL, I have been hearing more and more about GPUs. Before that, I only knew GPUs as the graphics cards that gamers needed for video games.

This was my favorite game when I was a girl.

Image 1: GTX 980 Ti 4K GPU (from Street Fighter 5 PC Gameplay)

But no… the GPU does much more than just graphics processing. Initially, the main purpose of the GPU was to render 3D games (that is, to generate a realistic image from a 3D model). Over time, this component evolved into a general-purpose GPU.

Do we remember what a GPU is and how it works?

A GPU is a microprocessor. It does specific tasks like quick mathematical calculations to manipulate and display images on the computer.

GPUs are made up of memory, capacitors, voltage regulators, power supplies, and other components that some of us like to tinker with. A GPU has everything it needs to be programmable, so what makes it different from a CPU?

A CPU has a handful of cores and executes processes mostly sequentially: a core must finish one task before it can start the next. A GPU, on the other hand, has thousands of cores capable of performing thousands of mathematical operations in parallel.

Image 2: the difference between CPU and GPU cores (from katsuhi galleries)
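To make that contrast concrete, here is a minimal sketch of the same addition done both ways. I am using Numba, a Python library that lets you write CUDA kernels in Python; that choice (and the function names) is mine, just for illustration. The CPU version walks the array one element at a time, while the GPU version hands each of thousands of threads exactly one element.

```python
import numpy as np
from numba import cuda

# CPU version: one core walks the array element by element.
def add_on_cpu(x, y, out):
    for i in range(x.shape[0]):
        out[i] = x[i] + y[i]

# GPU version: each thread handles exactly one element, so
# thousands of additions can happen at the same time.
@cuda.jit
def add_on_gpu(x, y, out):
    i = cuda.grid(1)        # global index of this thread
    if i < x.shape[0]:      # guard: we may launch a few extra threads
        out[i] = x[i] + y[i]

n = 1_000_000
x = np.ones(n, dtype=np.float32)
y = np.ones(n, dtype=np.float32)

# Move the data to GPU memory, launch enough threads to cover
# every element, then copy the result back to the CPU.
d_x, d_y = cuda.to_device(x), cuda.to_device(y)
d_out = cuda.device_array_like(x)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
add_on_gpu[blocks, threads_per_block](d_x, d_y, d_out)

print(d_out.copy_to_host()[:5])  # -> [2. 2. 2. 2. 2.]
```

The guard inside the kernel is there because the launch rounds the thread count up to a whole number of blocks, so a few threads may fall past the end of the array.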

This is why the GPU is a very powerful computational device. There are several types of GPU, so if you are thinking about buying a machine with a GPU, keep the different models in mind. GPUs also cost more than CPUs; for example, the ones I usually use on Google Cloud Platform are the K80, P100, and V100. You can review a summary of the costs at the end of this page: Google Cloud

General-Purpose Graphics Processing Unit (GPGPU)

NVIDIA developed the world's first GPU, the GeForce 256, launched on August 31, 1999. Over the years NVIDIA kept improving its GPUs and introduced CUDA, which lets developers write their own algorithms to run on the GPU and build applications that use GPUs for general purposes, such as the following (there is a small code sketch below):

  • Machine learning
  • Deep learning
  • Artificial Intelligence
  • Data Science
  • Predictive Analytics
  • Big data

All this makes it possible to get the most out of the computational capabilities of GPUs.
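To give a small taste of what "Data Science" on a GPU can look like, here is a hedged sketch using cuDF, a GPU DataFrame library from NVIDIA's RAPIDS ecosystem. The library and the toy data are my own example, not something this post depends on, but they show the idea: the table lives in GPU memory and the arithmetic and group-by run across the GPU's many cores, while the code reads almost like pandas.

```python
import cudf

# A small table that lives in GPU memory.
sales = cudf.DataFrame({
    "store": ["A", "A", "B", "B", "C"],
    "units": [10, 12, 7, 9, 15],
    "price": [2.5, 2.5, 3.0, 3.0, 1.5],
})

# Column arithmetic and the group-by below both run on the GPU.
sales["revenue"] = sales["units"] * sales["price"]
revenue_per_store = sales.groupby("store")["revenue"].sum()

print(revenue_per_store)
```

The point is not this particular library but the pattern: the same massively parallel hardware that renders game frames can also churn through table columns, model weights, and tensors.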

Before finishing this article, I would like to share a very funny video in which Adam Savage and Jamie Hyneman of MythBusters simulate the parallel processing of a GPU.

Video 1: GPU versus CPU (December 4, 2009, for NVIDIA)

Once again, a cordial greeting to all! ;)
