5 Myths about GPUs

by Bradley Nice, Content Manager at ClickHelp.com — online documentation tool

Hey there! This time we’ll be talking about another vital component of our computers: the graphics processing unit, also known as a video card, or just GPU for short.

This post will be mostly game-oriented because gaming is where GPUs get the most attention. I know, I know, there are plenty of other ways to make use of a GPU, like video editing, computing, 3D modeling, video rendering, and other stuff. But we’re not taking professional workstation solutions into account here, because they are used by professionals who know their stuff. We’re talking about gamers, who started most of the myths you’ll find here.

The human eye can’t perceive more than X frames per second

Frame rate (expressed in frames per second or fps) is the frequency (rate) at which consecutive images called frames appear on a display. © Wikipedia

Ugh… It creeps me out that this myth is still out there. If fps had been a thing back in the Middle Ages, it could very well have been the reason for the Crusades. In the gaming world (because it’s mostly gamers who argue about it) this myth is a stumbling block for the community. Wars are waged, hundreds of thread pages are written on forums, and still there is no consensus.

It’s almost like a religion that split at some point into several branches:

  • “Cinema Witnesses”. Those who believe that we can’t perceive more than 24 fps. This misconception comes from filmmaking. To cut a long story short: 24 fps was just enough for people to perceive the frames not as separate images following one another, but as actual, natural motion. And it was cheap. Why increase the frame count to 48 (Edison, for example, built a camera that operated at a higher frame rate) and double the amount of film stock needed, if 24 was good enough? So even back then it wasn’t in question that the human eye can see more than 24 fps.
  • “Past-gen Witnesses”. Those who believe that 30 fps is the threshold. This one probably originated from console frame-rate caps, because developers often limit their games to run at 30 fps (we’re not talking about current-gen consoles; things are shifting towards 60 fps now). But notice: they limit them not because the human eye can’t perceive more, but for various reasons like optimization.
  • Then there are several smaller groups who believe different numbers are in play — 48, 60, 70, etc.

Take a look at this video:

I bet you can clearly see the difference in slow motion: the higher the monitor’s refresh rate (which is not the same thing as fps, but the refresh rate caps how many frames per second your monitor can display) and, in turn, the higher the possible fps (assuming the game runs at the maximum fps each refresh rate allows, so fps = refresh rate), the smoother the motion. But that’s slow motion; at normal speed you might not notice much of a difference, because the frames change a lot faster. So let’s check this example:

I bet you can see the difference here as well, and it’s quite big. But what about facts? Well, myelinated nerves can fire between 300 and 1,000 times per second, which in theory allows your eye and brain to interpret up to 1,000 frames per second. But that’s theory. In practice, you probably won’t be able to tell apart frame rates higher than about 150 fps. And for most people, that number will be even lower, because they don’t have a trained eye (like gamers do 😏) and just can’t spot the difference.

Another fact confirming that people actually see more than 60 fps: VR headsets run at a 90 Hz refresh rate (PSVR at 120 Hz). Our peripheral vision (which we almost don’t use when sitting in front of a monitor or TV) is much more sensitive to motion, and 60 Hz is not enough to keep our eyes from noticing the stuttering it causes.

If you switch to a 120 Hz monitor after years of using a 60 Hz one, you may not spot the difference at first. But give it a day or two, then go back to the 60 Hz panel, and you will immediately feel that ‘something is off’.

The bottom line: most people can tell the difference between refresh rates up to about 60 Hz, while for most gamers that threshold is higher, up to roughly 120 Hz. Beyond that, the changes are barely noticeable for the majority of us.
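
A bit of arithmetic helps explain why the gains shrink at the top end. Here’s a minimal Python sketch (the refresh rates are just common examples) showing how little the per-frame time improves as the numbers climb:

```python
# Frame time in milliseconds for common refresh rates (assuming fps = refresh rate).
for fps in [24, 30, 60, 120, 144, 240]:
    frame_time_ms = 1000 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:6.2f} ms per frame")

# Going from 30 to 60 fps shaves ~16.7 ms off every frame;
# going from 120 to 240 fps shaves only ~4.2 ms, which is why
# each step up gets harder and harder to notice.
```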

More memory (VRAM) always increases performance

This one is quite complex. But to put it simply: yes, extra memory can increase performance, but only if a greater amount of video memory is actually needed (who would’ve thought, huh?).

If a game requires (and actually tries to use) more video memory than your GPU has, you might see a drop in performance, and switching to the same video card with a higher amount of VRAM can help.

But if a game uses around 2 GB of VRAM, your experience on the 3 GB and 6 GB VRAM versions of the same graphics card will be roughly the same at the same settings.

In general, video memory is used for loading textures, holding the assets required to render a frame (shadow maps and such), and quite a bunch of other stuff. More memory may be needed if:

  • you want to play at higher resolutions. 4K will require roughly 4 times more VRAM for the frame itself than 1080p, because the frame that needs to be rendered contains 4 times as many pixels (see the sketch after this list).
  • you want to play with higher texture resolutions. Yes, textures are loaded into video memory, and yes, if you increase texture resolution (from 512×512 to 4096×4096, let’s say) you will need more VRAM. Not necessarily proportionally more, but more.
  • you want to play with higher render-based antialiasing settings. I won’t go into detail about antialiasing types here, but hear this: FXAA and MLAA won’t increase your VRAM consumption a single bit, while FSAA, MSAA, CSAA, CFAA, and other render-based antialiasing methods will require significantly more video memory (sometimes twice as much, or even more), because they effectively increase the number of pixels that need to be rendered.
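
To put rough numbers on those bullets, here’s a back-of-the-envelope Python sketch. It deliberately ignores texture compression, mipmaps, padding, and everything else real drivers do, so treat it as an illustration, not a measurement:

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, msaa=1):
    """Rough size of one render target; MSAA multiplies the sample count."""
    return width * height * bytes_per_pixel * msaa / (1024 ** 2)

def texture_mb(size, bytes_per_pixel=4):
    """Rough size of one square uncompressed texture (no mipmaps)."""
    return size * size * bytes_per_pixel / (1024 ** 2)

print(f"1080p frame:       {framebuffer_mb(1920, 1080):6.1f} MB")
print(f"4K frame:          {framebuffer_mb(3840, 2160):6.1f} MB")  # ~4x the pixels, ~4x the memory
print(f"4K + 4x MSAA:      {framebuffer_mb(3840, 2160, msaa=4):6.1f} MB")
print(f"512x512 texture:   {texture_mb(512):6.1f} MB")
print(f"4096x4096 texture: {texture_mb(4096):6.1f} MB")
```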

Baking a GPU in an oven can resurrect it

If your GPU is acting up, and you see something similar to this…

…it means that something’s wrong (duh). And on most forums, people will tell you to bake your GPU. No, I mean it, literally bake it. Put it on a baking pan, wrap it in tin foil, and put it into your oven for a couple of minutes.

What can I say… this actually works sometimes. SOMETIMES. And you should use it only as a last resort, when you’ve got nothing to lose. If it’s a new GPU that’s still under warranty, take it to a service center instead. As a short-term fix, though, when you can’t buy a new video card right now, it may be okay.

But listen to my advice: don’t dabble in the dark arts. Baking and resurrecting a GPU is necromancy and can get you banned from the Mages Guild. You’ve been warned.

Two GPUs will double your performance

First of all, your motherboard and video cards must support a multi-GPU technology like SLI or CrossFire. Without that, it’s no use.

But even then, you won’t get twice the performance. It depends on the game you’re playing, but the boost will be somewhere between 0% and 80% at best.

You can look at this article with a nice chart comparing GTX 1080 Ti vs GTX 1080 Ti SLI performance. Spoiler: in some cases, SLI performed worse than a single video card.
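
To see what that scaling range means in practice, here’s a tiny Python sketch; the efficiency figures are made-up examples for illustration, not benchmarks of any real game:

```python
def dual_gpu_fps(single_gpu_fps, scaling_efficiency):
    """Estimate dual-GPU fps; efficiency 1.0 would be perfect (and mythical) 2x scaling."""
    return single_gpu_fps * (1 + scaling_efficiency)

# Hypothetical examples: scaling varies per game, from harmful to quite good.
for game, eff in [("well-optimized title", 0.8), ("average title", 0.4),
                  ("no SLI profile", 0.0), ("broken SLI profile", -0.1)]:
    print(f"{game:22s}: 60 fps -> {dual_gpu_fps(60, eff):5.1f} fps")
```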

Overclocking will damage your card

Just like with CPUs, overclocking in and of itself cannot damage a GPU. If you set a clock speed the card doesn’t support or can’t handle, it will either simply reset the settings (CPUs mostly do this), crash, or behave inconsistently. But this causes no damage to the video card itself.

And just like with CPUs, voltage and heat are what can actually do harm. But modern GPUs will either shut down or throttle to prevent damage. They are smart like that.
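
In spirit, that protection logic is as simple as the toy Python sketch below; the temperature thresholds here are invented for illustration, and real firmware is far more sophisticated:

```python
# Toy model of GPU thermal protection; thresholds are illustrative, not real specs.
THROTTLE_TEMP_C = 90   # start reducing clocks here
SHUTDOWN_TEMP_C = 105  # hard power-off to prevent damage

def protect(temp_c, clock_mhz):
    if temp_c >= SHUTDOWN_TEMP_C:
        return 0                     # emergency shutdown
    if temp_c >= THROTTLE_TEMP_C:
        return int(clock_mhz * 0.8)  # throttle: trade speed for safety
    return clock_mhz                 # all good, run at full speed

for temp in [70, 92, 107]:
    print(f"{temp}°C -> clock {protect(temp, 1800)} MHz")
```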

So I hope I shed some more light on all this, and if the article helped at least someone, I’m happy. Let me know what you think in the comment section!

Have a nice day!

Bradley Nice,
Content Manager at ClickHelp.com — best online documentation tool for SaaS vendors
