G-SYNC 101: G-SYNC Ceiling vs. V-SYNC


Identical or Fraternal?

As described in G-SYNC 101: Range, G-SYNC doesn’t actually become double buffer V-SYNC above its range (nor does V-SYNC take over), but instead, G-SYNC mimics V-SYNC behavior when it can no longer adjust the refresh rate to the framerate. So, when G-SYNC hits or exceeds its ceiling, how close is it to behaving like standalone V-SYNC?

[Charts: Blur Buster's G-SYNC 101: Input Latency & Optimal Settings]

Pretty close. However, the G-SYNC numbers do show a reduction, mainly in the minimums and averages across refresh rates. Why? It comes down to how G-SYNC and V-SYNC behave whenever the framerate falls (even for a moment) below the maximum refresh rate. With double buffer V-SYNC, a fixed frame delivery window is missed, and the framerate is locked to half the refresh rate by a repeated frame, maintaining extra latency; in the same situation, G-SYNC simply adjusts the refresh rate to match the framerate, eliminating that latency.
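The timing difference above can be sketched with a toy model. This is a minimal illustration with made-up numbers (a hypothetical 144Hz display and a frame that finishes 0.1 ms late), not a description of actual driver internals: under fixed-refresh V-SYNC, a frame that narrowly misses a scanout tick must wait for the next one, while a variable refresh display can start scanout as soon as the frame is ready.

```python
# Toy model: how long a finished frame waits before scanout begins on a
# hypothetical 144Hz display, under fixed-refresh V-SYNC vs. G-SYNC.
REFRESH_INTERVAL_MS = 1000 / 144  # ~6.94 ms per scanout at 144Hz

def vsync_display_time(render_done_ms):
    """Fixed refresh: the frame waits for the next scanout tick."""
    ticks = int(render_done_ms // REFRESH_INTERVAL_MS) + 1
    return ticks * REFRESH_INTERVAL_MS

def gsync_display_time(render_done_ms):
    """Variable refresh (within G-SYNC range): scanout starts when the frame is ready."""
    return render_done_ms

# A frame that finishes 0.1 ms after the scanout tick it was aiming for:
late_frame = REFRESH_INTERVAL_MS + 0.1

vsync_wait = vsync_display_time(late_frame) - late_frame  # nearly a full refresh cycle
gsync_wait = gsync_display_time(late_frame) - late_frame  # no added wait

print(f"V-SYNC added wait: {vsync_wait:.2f} ms")  # ~6.84 ms
print(f"G-SYNC added wait: {gsync_wait:.2f} ms")  # 0.00 ms
```

Missing the tick by a fraction of a millisecond costs nearly a full refresh cycle under fixed-refresh V-SYNC, which is exactly the latency G-SYNC avoids when the framerate momentarily dips below the refresh rate.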

As for “triple buffer” V-SYNC, the subject won’t be delved into here, since G-SYNC is based on a double buffer, but the name actually encompasses two entirely separate methods. The first can be considered “alt” triple buffer V-SYNC, and is the method featured in the majority of modern games. Unlike double buffer V-SYNC, it prevents the lock to half the refresh rate when the framerate falls below it, but in turn adds 1 frame of delay over double buffer V-SYNC when the framerate exceeds the refresh rate; if double buffer adds 2-6 frames of delay, for instance, this method adds 3-7 frames.

“True” triple buffer V-SYNC, like “alt,” prevents the lock to half the refresh rate, but unlike “alt,” can actually reduce V-SYNC latency when the framerate exceeds the refresh rate. This “true” method is rarely used, and its availability, in part, can depend on the game engine’s API (OpenGL, DirectX, etc).
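The delay arithmetic for the two methods can be made concrete. This is an illustrative sketch of the article's own example figures, not a model of any specific driver: "alt" triple buffering adds one extra queued frame of delay on top of double buffer V-SYNC when the framerate exceeds the refresh rate, whereas "true" triple buffering discards the stale back buffer rather than queuing behind it.

```python
def added_delay_frames(mode, double_buffer_delay):
    """Frames of delay above the refresh rate, per the article's example.

    'alt' triple buffer queues one additional pre-rendered frame behind
    double buffer V-SYNC; 'true' triple buffer instead drops the older
    back buffer when a newer frame completes, so it does not add to the
    double buffer figure (and can reduce it).
    """
    if mode == "double":
        return double_buffer_delay
    if mode == "alt_triple":  # extra back buffer queues one more frame
        return double_buffer_delay + 1
    raise ValueError(f"unknown mode: {mode}")

# The article's "2-6 frames" double buffer example becomes "3-7 frames":
for d in (2, 6):
    print(d, "->", added_delay_frames("alt_triple", d))  # 2 -> 3, 6 -> 7
```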

A form of this “true” method is implemented by the DWM (Desktop Window Manager) for borderless and windowed mode, and by Fast Sync, both of which will be explained in more detail further on.

Suffice it to say, even at its worst, G-SYNC beats V-SYNC.



3195 Comments For “G-SYNC 101”


retrospek (Member)

My monitor is 280Hz, but whenever I use G-SYNC + V-SYNC with Reflex on, it limits my FPS to 260. Should I instead use an in-game FPS cap of 277, turn off Reflex, and use Low Latency Mode on Ultra?

Thanks

Jorimt (Member)

I noticed in your LDAT testing that Reflex had the same input lag (16ms), but at a lower 224 FPS than the in-game frame limiter alone (237 FPS). Is there a reason to use Reflex when it seems like you’re paying a ~13 FPS penalty for effectively the same result?

Also, just a small typo I noticed on the last page (14):

… ~224 FPS @240Hz (etc) whenever the framerate can be sustained above the refesh rate, and 2) dynamically monitor and limit the framerate whenever it can’t be sustained above the refresh rate …

voicon (Member)

So, in summary, what is the best setting for input lag if I turn off both G-SYNC and V-SYNC?

Dvrgg (Member)

Hello, I wanted to ask: is it worth using the RTSS FPS limit for a game (-3 FPS from a 165Hz monitor) if the frame rate is not perfectly smooth when using the internal limiter? (G-SYNC + V-SYNC, -3 FPS.) And also, is there any point in enabling Reflex/LLM if the GPU is not loaded to 99%? Do I need to do anything else to achieve maximum smoothness, even at the cost of input lag?

2xmos (Member)

Hello, I have a question.

You advise setting a -3 FPS limit in games without Reflex, but Reflex caps at -7%, so why don’t you advise setting -7%?

Thanks
