Freesync and G-sync monitors explained

Dogtor Flashbank
4 min read · Mar 10, 2021


Do I need to buy a Freesync/G-sync monitor for gaming?
Not really. The same effect is achievable in software; see the RTSS section below.

Which is better — Freesync or G-sync?
They are brand names for basically the same thing. FreeSync is AMD’s brand name, while G-sync is Nvidia’s. I’d say they’re like Pepsi and Coca-Cola, and the choice is mostly a matter of taste.

To look a bit deeper, there is a VESA standard called Adaptive-Sync, which AMD simply implements as FreeSync. Nvidia chose to go for the bucks by packaging the same idea as a proprietary technology and marketing it as something better.

Some people believe that Nvidia’s solution is indeed somehow special and better. Not me. When I think of Nvidia, this is the image that comes to mind.

Linus Torvalds, the creator of the Linux operating system, giving Nvidia the finger

The guy pictured is Linus Torvalds. He’s the creator of Linux, the operating system that powers more computing devices than any other in the world. Android is based on Linux. Your router almost certainly runs Linux. The web server that sent you this page is most likely running Linux.

Linux can also run on a desktop computer. For that, it needs drivers for various components, including GPUs. Nvidia was making it so hard to create those drivers that Torvalds called it “the single worst company” the Linux developer community has ever dealt with.

At the same time Nvidia is known to spend heavily on marketing. As far as I know, there are no independent tests that would suggest that G-sync monitors are better than Freesync monitors. If you are aware of any, let me know. In the meantime, I will continue to believe that G-sync’s alleged superiority is just another piece of marketing bullshit.

What is the purpose of Adaptive/Free/G-Sync?
The purpose is to eliminate screen tear. As the Adaptive-Sync article says,

Computer monitors normally refresh their displays at a fixed frame rate. In gaming applications, a computer’s CPU or GPU output frame rate will vary according to the rendering complexity of the image. If a display’s refresh rate and a computer’s render rate are not synchronized, visual artifacts — tearing or stuttering — can be seen by the user.

BTW, stutter is not so much a sync issue as a sign of the computer not being able to produce frames fast enough.
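To put rough numbers on the tearing part, here is a back-of-the-envelope sketch (the figures are illustrative, not measurements): a 144 Hz monitor starts a scanout every ~6.94 ms, while an uncapped GPU pushing 200 FPS finishes a frame every 5 ms, so each new frame lands at a different point of the scanout and the tear line wanders around the screen.

```python
# Back-of-the-envelope: where do uncapped frames land within a refresh?
# All numbers are illustrative, not measurements.
refresh_hz = 144
fps = 200  # uncapped GPU output

refresh_ms = 1000 / refresh_hz  # ~6.94 ms per refresh (scanout)
frame_ms = 1000 / fps           # 5.00 ms per rendered frame

for n in range(1, 6):
    ready = n * frame_ms                       # when the nth frame is done
    phase = (ready % refresh_ms) / refresh_ms  # how far into the scanout
    print(f"frame {n} ready at {ready:5.2f} ms, {phase:4.0%} into a refresh")
```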

How does Adaptive/Free/G-Sync work?
In short, the monitor refreshes the screen when a new frame is available from the GPU:

Adaptive-Sync enables the display to dynamically match a GPU’s rendering rate, on a frame-by-frame basis, to produce a smoother, low latency, gaming experience.

This is all well and good as long as the GPU pumps out frames at a steady rate close to the monitor’s refresh rate. In practice, frame times will be inconsistent (some normal, some longer), resulting in uneven input lag and probably some stutter.
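A toy model of the idea (the panel limits and frame times below are invented for illustration): the display waits for each frame instead of ticking on a fixed clock, but only within the panel’s supported range, so the pacing is exactly as uneven as the GPU’s output.

```python
# Toy model of variable refresh: the panel refreshes when a frame
# arrives, clamped to its supported range. All numbers are invented.
MIN_MS = 1000 / 144  # the panel can't refresh faster than 144 Hz
MAX_MS = 1000 / 48   # below 48 FPS it has to refresh on its own anyway

frame_times_ms = [7.1, 6.8, 12.3, 7.0, 25.0]  # uneven GPU frame times

for ft in frame_times_ms:
    interval = min(max(ft, MIN_MS), MAX_MS)  # the refresh tracks the frame
    print(f"frame took {ft:5.1f} ms -> refresh after {interval:5.2f} ms")
```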

I have a monitor with Freesync/G-sync, should I enable it?
Maybe. Depending on your setup, it might introduce extra input lag; see below.

What is input lag?
Input lag is the delay between an action (like moving your mouse or pressing a key) and the display updating to reflect that action. Ideally, it should be as short as possible.

Does Freesync/G-sync introduce input lag?
According to this video, it sometimes does. The author measured response times for various setups in one game, CS:GO, and found that both FreeSync and G-sync introduce considerable input lag if the frame rate is uncapped.

In other words, for an optimal FreeSync/G-sync experience you apparently need to cap your frame rate at (or just below) the monitor’s refresh rate. For example, the author of the video capped the frame rate at 138 FPS for 144 Hz monitors.
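The arithmetic behind that cap is straightforward (using the video’s numbers):

```python
# The arithmetic behind capping at 138 FPS on a 144 Hz monitor.
refresh_hz = 144
cap_fps = 138  # the cap used in the video

refresh_ms = 1000 / refresh_hz  # ~6.94 ms between refreshes
frame_ms = 1000 / cap_fps       # ~7.25 ms between frames

# Each frame takes slightly longer than a refresh, so the GPU never
# races ahead of the panel and frames don't pile up in a lag-adding queue.
print(f"refresh every {refresh_ms:.2f} ms, frame every {frame_ms:.2f} ms")
print(f"headroom per frame: {frame_ms - refresh_ms:.2f} ms")
```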

But I thought the more FPS, the better?
In general, yes, because the input lag will be smaller, but if the display is unsynced, you’ll get screen tear. The optimal situation is a steady FPS equal to the monitor’s refresh rate, synced so that there’s no screen tear.
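To see why more FPS helps, compare the frame interval, which puts a floor on how stale the picture can be (a quick illustration):

```python
# A new frame arrives every 1000/fps ms; higher FPS means fresher frames
# on screen, which is why more FPS generally means less input lag.
for fps in (60, 144, 240, 360):
    print(f"{fps:3d} FPS -> a new frame every {1000 / fps:5.2f} ms")
```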

How do I cap the frame rate in games?
Some games allow you to do that in game settings. A good universal method is to use RTSS.
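To make “capping” concrete, here is a minimal sketch of what a frame limiter does. This is a toy illustration, not how RTSS is actually implemented:

```python
import time

TARGET_FPS = 138
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allotted to each frame

def run_capped(render_frame, num_frames):
    """Render frames, sleeping away whatever is left of each budget."""
    deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
        deadline += FRAME_BUDGET
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            # real limiters busy-wait the last sliver for precision;
            # a plain sleep is enough to show the idea
            time.sleep(remaining)

# e.g. run_capped(lambda: None, 10) paces ten empty "frames" at 138 FPS
```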

What is RTSS and what benefits does it provide?
RTSS stands for RivaTuner Statistics Server. According to its description, it provides framerate monitoring, On-Screen Display and high-performance video capture.

In plain words, RTSS is a small piece of software. Here are its two main benefits for me:

  1. It shows the current FPS on screen, and it also shows a nice FPS graph
  2. It caps the frame rate and eliminates screen tear in games

How do I cap FPS with RTSS?
There are two options for that, Framerate limit and Scanline sync. Both are next to each other on the RTSS main screen. You only need to use one at a time.

To use Framerate limit, enter a framerate value in the field provided. Long story short, you want it slightly lower than your monitor’s refresh rate. For example, if you have a 144 Hz monitor, enter something like 143.95.

To use Scanline sync, enter the scanline number where the tear should happen. You want it off the visible screen, so enter something like -20.
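My reading of the negative value is that it counts back from the end of the frame’s vertical total (visible lines plus blanking lines), which parks the tear inside the blanking interval where you can’t see it. A rough illustration with made-up panel numbers:

```python
# Where does scanline -20 land? The numbers below are a made-up example;
# the vertical total is panel- and mode-specific.
visible_lines = 1440    # lines you actually see on a 1440p panel
vertical_total = 1481   # visible lines plus blanking lines (assumed)

scanline_sync = -20
flip_line = vertical_total + scanline_sync  # negative counts from the end
print(f"flip at line {flip_line}, off screen: {flip_line >= visible_lines}")
```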

How is it better than the old v-sync?
Both eliminate screen tear, but v-sync does it by making the game wait for the monitor’s fixed refresh, which adds input lag. With RTSS, the game keeps its own pace and the frame flip is timed so that the tear lands off screen. The result is less input lag.
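A quick illustration of the v-sync penalty (made-up timings): a frame that just misses a refresh boundary has to sit and wait for the next one before it can be shown.

```python
import math

# Made-up timings: with v-sync, a frame that misses a refresh boundary
# waits for the next one.
refresh_ms = 1000 / 144  # a refresh boundary every ~6.94 ms
frame_ready_ms = 7.5     # the frame finished just after a boundary

shown_at = math.ceil(frame_ready_ms / refresh_ms) * refresh_ms
print(f"ready at {frame_ready_ms:.2f} ms, shown at {shown_at:.2f} ms "
      f"(waited {shown_at - frame_ready_ms:.2f} ms)")
```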

Isn’t that exactly what Adaptive/Free/G-sync does?
Not exactly. These technologies remove screen tear, but they don’t provide steady frame times on their own. You still need to cap the FPS to stabilize frame times and minimize input lag, and a capped, well-timed frame rate is exactly what RTSS gives you.
