G-SYNC 101: G-SYNC vs. V-SYNC OFF


Beyond the Limits of the Scanout

It’s already been established that single, tear-free frame delivery is limited by the scanout, and V-SYNC OFF can defeat it by allowing more than one frame scan per scanout. That said, how much of an input lag advantage can be had over G-SYNC, and how high must the framerate be sustained above the refresh rate to diminish tearing artifacts and justify the difference?

[Charts: G-SYNC vs. V-SYNC OFF input latency measurements]

Quite high. Counting first on-screen reactions, V-SYNC OFF already holds a slight input lag advantage over G-SYNC at the same framerate (up to 1/2 frame, with the advantage growing as the refresh rate drops), but it takes a considerable increase in framerate above the given refresh rate to widen the gap to significant levels. And while the reductions may look significant in bar chart form, even with framerates in excess of 3x the refresh rate, and when measured at middle screen (crosshair-level) only, V-SYNC OFF has a limited advantage over G-SYNC in practice. Much of that advantage lies in areas that, for the average player, are arguably useless; for instance, a viewmodel's wrist updating 1-3ms faster with V-SYNC OFF.
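The "up to 1/2 frame" figure follows from scanout timing: with V-SYNC OFF, a newer frame can replace the remainder of the in-progress scanout at any point, so on average (mid-screen) the saving is about half a scanout. A rough back-of-envelope sketch in Python, using illustrative refresh rates only:

```python
# Rough sketch: the maximum V-SYNC OFF advantage over G-SYNC at equal
# framerates is about half a scanout, since a tearing frame can take over
# the remainder of the current scan partway down the screen.

def scanout_ms(refresh_hz: float) -> float:
    """Time for one full top-to-bottom scanout, in milliseconds."""
    return 1000.0 / refresh_hz

def max_vsync_off_advantage_ms(refresh_hz: float) -> float:
    """Up-to-half-frame latency advantage of V-SYNC OFF at the same framerate."""
    return scanout_ms(refresh_hz) / 2.0

for hz in (60, 144, 240):  # example refresh rates, not measured data
    print(f"{hz:3d}Hz: scanout {scanout_ms(hz):5.2f}ms, "
          f"max advantage ~{max_vsync_off_advantage_ms(hz):.2f}ms")
```

This also shows why the advantage shrinks at higher refresh rates: half a scanout is ~8.3ms at 60Hz but only ~2.1ms at 240Hz.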

This is where the refresh rate/sustained framerate ratio factors in:

[Diagrams: refresh rate/sustained framerate ratio]

As shown in the above diagrams, the true advantage comes when V-SYNC OFF can allow not just two, but multiple frame scans in a single scanout. Unlike synced solutions, V-SYNC OFF does not pace the frametime to the scanout; a frame begins scanning in as soon as it is rendered, regardless of whether the previous frame scan is still in progress. At 144Hz with 1000 FPS, for instance, a sustained frametime of 1ms means the display updates nearly 7 times within a single scanout.
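The "nearly 7 times" figure is just the ratio of scanout time to frametime, which reduces to framerate divided by refresh rate. A minimal sketch, using the article's 144Hz/1000 FPS example:

```python
# Sketch: how many V-SYNC OFF frame "slices" land within one scanout when
# the framerate far exceeds the refresh rate. Figures from the text:
# 144Hz (about a 6.94ms scanout) at a sustained 1000 FPS (1ms frametime).

def slices_per_scanout(refresh_hz: float, fps: float) -> float:
    scanout_ms = 1000.0 / refresh_hz    # time for one full scanout
    frametime_ms = 1000.0 / fps         # sustained time per rendered frame
    return scanout_ms / frametime_ms    # simplifies to fps / refresh_hz

print(f"{slices_per_scanout(144, 1000):.2f}")  # roughly 6.94, i.e. nearly 7
```

The same arithmetic gives the 5x ratio discussed below: at 240Hz, sustaining 1200 FPS or more yields five or more slices per scanout.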

In fact, at 240Hz, first on-screen reactions became so fast at 1000 FPS that the inherent delay in my mouse and display became the bottleneck for minimum measurements.

So, for competitive players, V-SYNC OFF still reigns supreme in the input lag realm, especially if sustained framerates can exceed the refresh rate by 5x or more. However, while visible tearing artifacts are all but eliminated at these ratios on higher refresh rate displays, the tearing can instead manifest as microstutter, and thus, even at its best, V-SYNC OFF still can’t match the consistency of G-SYNC frame delivery.



3140 Comments For “G-SYNC 101”


taef (Member)

So I have a 540Hz monitor, but it’s running at 500Hz because I’m on Windows 10. I’m playing Fortnite, and for a long time I struggled with dropped frames, even with low settings and low usage of both CPU and GPU. I tried your method (G-SYNC + V-SYNC and capped frames) and the game runs so much better, thanks to you. But I’m still experiencing some dropped frames, just not as many as before. What should I do at this point, and what should I cap my frames at?

voicon (Member)

If my screen is 165Hz, and I play PUBG Mobile on Gameloop at only 120 FPS, how do I optimize the latency? Currently I still have lag when shooting.

Skirtap (Member)

Some people recommend 141 FPS, some 142 FPS; which one should I choose? Have you noticed any screen tearing etc. with the 142 FPS option?

claulo (Member)

Hello,
I acquired a Samsung G6 OLED monitor with 360Hz. My PC has the following components: 4080S TUF, 7800X3D, ASRock 650M PG WiFi, xflare DDR5 16×32 CL32 6000MHz, Thermalright 850W Platinum. I’ve noticed that in some games there are jumps between framerates that are annoying to look at. For example, in A Plague Tale: Requiem, all ultra at 2K, I range between 95 and 130 FPS. It’s not stuttering, but it is annoying. I wanted to know what type of configuration to use, and whether it is necessary to cap the FPS in the NVIDIA panel.

Zehdah (Member)

Hey, so I have a 3080, playing on a 240Hz G-SYNC monitor. I followed this guide (G-SYNC on, V-SYNC on in the NVIDIA Control Panel, off in game, FPS capped 3 below the refresh rate) and it has worked smoothly for every game. But in unoptimized recent releases such as The First Descendant, what do you do to optimize the game? In these games my GPU runs at max: 300-320W, 70 degrees, 99% load, whereas in other games I’ve never reached such power draw even at 200+ FPS. Wondering if I should cap at something like 120, since the game reaches around 125 max.
