Too good to be true, right? TL;DR: limiting the game’s framerate and enabling VSYNC effectively disables buffer queueing and thus input lag. Read the guide here.
The problem: no tearing vs. lag
When playing games on PC, the two most common problems are input lag, which is the delay between hitting a button and seeing the result on screen, and tearing, which is when the image appears split by a horizontal line. Tearing is caused by the GPU rendering frames at a different rate than the rate at which they are scanned out and sent to the monitor.
The oldest solution to tearing is VSYNC which, put very simply, makes sure the buffer from which the monitor reads is not swapped while it is being read. The game can continue to render frames as fast as possible and submit them to the buffer queue. When the queue is full, the game can’t submit another frame and hence it gets framerate limited. Note that because more than one frame is queued, there will be a delay between the game rendering a frame and the GPU sending it to the monitor. Thus, input lag is increased. There’s a more in-depth explanation of the problem and the solution, which will be presented below, in this post.
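To get a feel for how much lag the queue adds: under VSYNC, each queued frame waits one full refresh interval before it is scanned out, so the added latency is roughly the queue depth times the refresh period. A quick back-of-the-envelope sketch (the helper name and numbers are mine, for illustration only):

```python
# Rough sketch: extra latency added by queued buffers under VSYNC.
# Each frame sitting in the queue waits one full refresh interval
# (1000 / refresh_hz milliseconds) before it reaches the screen.

def queue_latency_ms(refresh_hz: float, queued_frames: int) -> float:
    """Approximate added input lag from `queued_frames` waiting buffers."""
    return queued_frames * 1000.0 / refresh_hz

# With two frames waiting in the queue at 60 Hz:
print(round(queue_latency_ms(60, 2), 1))  # 33.3 (ms of added lag)
```

This is why a full buffer queue feels so sluggish: two extra queued frames at 60 Hz already cost about two refresh intervals of delay.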
GSync and FreeSync (combined with a frame limit 1 or 2 fps below the screen’s refresh rate) solve this problem while only marginally increasing input lag. However, I don’t currently have such a monitor.
FastSync is another solution: the GPU renders frames as fast as possible and only the latest one is sent to the monitor. While it sounds like the ultimate solution, in reality it only works well when the GPU can achieve a framerate at least twice the screen’s refresh rate. Otherwise, judder is produced.
The idea is simple. We want to make sure that when the game renders a frame to a buffer, that buffer will be the next one shown on screen, not queued behind another. The way to do that is to keep the game’s framerate slightly below the screen’s refresh rate.
For example, if a monitor has a refresh rate of 59.95 Hz (shown as 59 Hz in Windows settings) and we limit the game’s framerate to 58 fps, we achieve “single buffer” VSYNC. However, we render roughly two fewer frames per second than the monitor displays, so the monitor has to repeat frames, and this results in noticeable judder.
To avoid that, we want to lower the framerate only marginally, to something like 59.94 fps. This way we get “single buffer” VSYNC and no judder (in fact, the monitor will repeat only 1 frame every 100 seconds). To limit a game’s framerate, I propose using RTSS. Even games that have built-in limiters don’t provide decimal precision, which is necessary in our case. Then, VSYNC can be enabled from within the game or from the GPU’s options.
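The “1 repeated frame every 100 seconds” figure falls straight out of the difference between refresh rate and frame limit: the monitor repeats a frame once per second for every frame-per-second of deficit. A small sketch (function name and example values are my own):

```python
# Sketch: how often the monitor has to repeat a frame when the game
# renders slightly slower than the monitor refreshes. The deficit
# (refresh - limit) is the number of "missing" frames per second.

def repeat_interval_seconds(refresh_hz: float, limit_fps: float) -> float:
    """Seconds between repeated frames for a framerate limit below
    the refresh rate."""
    deficit = refresh_hz - limit_fps
    if deficit <= 0:
        raise ValueError("limit must be below the refresh rate")
    return 1.0 / deficit

# Limiting to 58 fps on a 59.95 Hz panel: a repeat roughly twice a second
print(round(repeat_interval_seconds(59.95, 58.0), 2))  # 0.51

# Limiting to 59.94 fps: roughly one repeated frame every 100 seconds
print(repeat_interval_seconds(59.95, 59.94))
```

The closer the limit is to the refresh rate (from below), the rarer the repeats, and the less visible the judder.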
If you want to try the solution yourself, follow the detailed guide here.
Time for some testing. Input lag is really easy to “feel” in first person shooter games. So, I tried running Overwatch with VSYNC on. The lag was really noticeable. I enabled framerate limiting via RTSS and I was amazed to feel the lag reduction instantly. However, after 1 minute I noticed the lag again. I guess that 1 more buffer got queued. So, I decreased my framerate limit to 59.930 fps and the problem was fixed.
I also tried FastSync and limited the framerate to 240 fps. I could feel that the input lag was even lower than with the proposed solution. And of course, input lag was lowest when no sync was used. Setting a framerate limit of 59 or 240 fps (with No Sync) was faster than the syncing methods.
I will continue to play Overwatch with No Sync because input lag is of utmost importance. The good thing with first person shooters is that tearing is not very visible, especially when the framerate exceeds the refresh rate.
Cuphead is a 2D scrolling game where tearing is really easy to notice. On the other hand, input lag is important because of the fast nature of the game.
FastSync could not be used because the game limits its own framerate, and judder was noticeable.
The proposed solution worked perfectly. I can now enjoy the game without tearing or judder while it stays really responsive.
GSYNC VS FastSync VS VSYNC
Even though I didn’t take measurements, I could feel that from lowest to highest the input lag goes as follows:
No Sync < No Sync + framelimit < FastSync < VSync + framelimit << VSYNC
I browsed through the measurements on BlurBusters and merged them into the following comparison. So, for 60 Hz and Overwatch:
No Sync + 58 fps limit: 34 ms | +0 ms
GSync + 58 fps limit: 37 ms | +3 ms
Fast Sync: 39 ms | +5 ms
VSync + 58 fps limit: 48 ms | +14 ms
VSync: 98 ms | +64 ms
The proposed method (VSYNC + limit) is 50ms faster than VSYNC.
The added delay over No Sync is just 14 ms. Note that there were no measurements for “No Sync” without a frame limit, but judging from the CSGO measurements (300 fps limit), there’s about 10 ms of lag between “No Sync 300 fps” and “No Sync 58 fps”. So the added lag of “No Sync 300 fps” vs “VSYNC 58 fps” should be around 10 + 14 = 24 ms.
For the solution to work, your PC must be able to render frames at the specified framerate consistently. If it’s only able to render between 55–65 fps, this solution will not work for a 60Hz monitor.
The solution described here is not related to the “maximum pre-rendered frames” setting in the NVIDIA control panel, which is a separate control independent of the buffer queue used by VSYNC. Setting it to “1” can additionally help reduce input lag.
To test whether RTSS is actually working and whether you have chosen the correct framerate limit, don’t enable VSYNC just yet. Notice at which point of the screen you see the tearing. If everything is working correctly, the tear line should stay at roughly the same vertical position and move downwards really slowly. If it stays in the same position forever or moves upwards, your framerate is higher than the screen’s refresh rate, and when you enable VSYNC 2 buffers will be queued. If it changes position rapidly, framerate limiting is not working as expected.
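The tear line’s drift speed has the same origin as the judder math above: the line slips by one full screen height for every frame of phase difference, so it takes 1/(refresh − fps) seconds to sweep the screen. A rough sketch to check your expectations against what you see (the function name and sign convention are mine):

```python
# Sketch: how long the tear line takes to drift one full screen height.
# The line slips by the phase difference between rendering and scanout,
# i.e. one screen height per 1/(refresh - fps) seconds.

def tearline_sweep_seconds(refresh_hz: float, limit_fps: float) -> float:
    """Seconds for the tear line to travel one screen height.
    Positive: drifts downwards (fps below refresh).
    Negative: drifts upwards (fps above refresh)."""
    delta = refresh_hz - limit_fps
    if delta == 0:
        return float("inf")  # tear line frozen in place
    return 1.0 / delta

# 59.94 fps limit on a 59.95 Hz panel: very slow downward drift
print(tearline_sweep_seconds(59.95, 59.94))  # roughly 100 s per sweep

# fps above refresh: negative, i.e. the tear line creeps upwards
print(tearline_sweep_seconds(59.95, 60.00))
```

So a tear line that crawls downwards over a minute or two is exactly what a correctly chosen limit looks like, while a stationary or upward-moving line means the limit is at or above the refresh rate.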