G-SYNC 101: G-SYNC Ceiling vs. V-SYNC


Identical or Fraternal?

As described in G-SYNC 101: Range, G-SYNC doesn’t actually become double buffer V-SYNC above its range (nor does V-SYNC take over), but instead, G-SYNC mimics V-SYNC behavior when it can no longer adjust the refresh rate to the framerate. So, when G-SYNC hits or exceeds its ceiling, how close is it to behaving like standalone V-SYNC?

[Charts: Blur Buster's G-SYNC 101 — input latency results across refresh rates]

Pretty close. However, the G-SYNC numbers do show a reduction, mainly in the minimums and averages across refresh rates. Why? It comes down to how G-SYNC and V-SYNC behave whenever the framerate falls (even momentarily) below the maximum refresh rate. With double buffer V-SYNC, a fixed frame delivery window is missed and the framerate is locked to half the refresh rate by a repeated frame, sustaining the extra latency, whereas G-SYNC simply adjusts the refresh rate to the framerate in the same instance, eliminating it.
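The difference can be sketched with a quick back-of-the-envelope calculation (a simplified model, not driver logic; the 144Hz rate and render times are hypothetical, chosen only for illustration):

```python
import math

REFRESH_HZ = 144
INTERVAL_MS = 1000 / REFRESH_HZ  # ~6.94 ms per fixed refresh window

def vsync_wait(render_ms):
    """Double buffer V-SYNC: a finished frame idles until the next
    fixed refresh boundary before it can be scanned out."""
    windows_used = math.ceil(render_ms / INTERVAL_MS)
    return windows_used * INTERVAL_MS - render_ms

def gsync_wait(render_ms):
    """G-SYNC (within its range): the display refreshes the moment
    the frame is ready, so there is no idle wait."""
    return 0.0

for render_ms in (5.0, 7.5, 10.0):  # the last two just miss a window
    print(f"render {render_ms:5.1f} ms | V-SYNC idles {vsync_wait(render_ms):5.2f} ms"
          f" | G-SYNC idles {gsync_wait(render_ms):5.2f} ms")
```

In this model, a 7.5 ms frame barely misses the ~6.94 ms window and must idle roughly 6.4 ms for the next scanout under V-SYNC, while under G-SYNC it is displayed immediately; that idle time is the latency the charts above reflect.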

As for “triple buffer” V-SYNC, the subject won’t be delved into here, since G-SYNC is based on a double buffer, but the name actually encompasses two entirely separate methods. The first should be considered “alt” triple buffer V-SYNC, and is the method featured in the majority of modern games. Unlike double buffer V-SYNC, it prevents the lock to half the refresh rate when the framerate falls below it, but in turn adds 1 frame of delay over double buffer V-SYNC when the framerate exceeds the refresh rate; if double buffer adds 2-6 frames of delay, for instance, this method adds 3-7.

“True” triple buffer V-SYNC, like “alt,” prevents the lock to half the refresh rate, but unlike “alt,” can actually reduce V-SYNC latency when the framerate exceeds the refresh rate. This “true” method is rarely used, and its availability, in part, can depend on the game engine’s API (OpenGL, DirectX, etc).

A form of this “true” method is implemented by the DWM (Desktop Window Manager) for borderless and windowed mode, and by Fast Sync, both of which will be explained in more detail further on.

Suffice it to say, even at its worst, G-SYNC beats V-SYNC.


