G-SYNC 101: G-SYNC vs. V-SYNC OFF w/FPS Limit


At the Mercy of the Scanout

Now that the FPS limit required for G-SYNC to avoid V-SYNC-level input lag has been established, how do G-SYNC + V-SYNC and G-SYNC + V-SYNC “Off” compare to V-SYNC OFF at the same framerate?

Blur Buster's G-SYNC 101: Input Latency & Optimal Settings

The results show a consistent difference among the three methods across most refresh rates (at 240Hz, the results are nearly equalized in any scenario), with V-SYNC OFF (and, to a lesser degree, G-SYNC + V-SYNC “Off”) appearing to have a slight edge over G-SYNC + V-SYNC. Why? The answer is tearing…

With any vertical synchronization method, the delivery speed of a single, tear-free frame (barring unrelated frame delay caused by many other factors) is ultimately limited by the scanout. As mentioned in G-SYNC 101: Range, the “scanout” is the total time it takes a single frame to be physically drawn, pixel by pixel, left to right, top to bottom, on-screen.

With a fixed refresh rate display, both the refresh rate and scanout remain fixed at their maximum, regardless of framerate. With G-SYNC, the refresh rate is matched to the framerate; while the scanout speed remains fixed, the refresh rate controls how many times the scanout is repeated per second (60 times at 60 FPS/60Hz, 45 times at 45 FPS/45Hz, etc.), along with the duration of the vertical blanking interval (the span between the previous and next frame scan), during which G-SYNC calculates and performs all overdrive and synchronization adjustments from frame to frame.
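As a rough sketch of that relationship (a simplified model for illustration; real VBI timing varies by display and this ignores overdrive processing), the scanout duration stays pinned to the panel's maximum refresh rate while the blanking interval absorbs the rest of each frametime:

```python
# Simplified timing model (an illustration, not measured behavior):
# under G-SYNC the scanout always runs at the panel's maximum speed,
# so at lower framerates the vertical blanking interval (VBI)
# stretches to fill the gap between one scanout and the next.
def vbi_ms(fps: float, max_refresh_hz: float) -> float:
    frametime = 1000.0 / fps            # time between frames (ms)
    scanout = 1000.0 / max_refresh_hz   # fixed scanout duration (ms)
    return frametime - scanout

# 60 FPS on a 144Hz panel: ~16.67 ms frametime, ~6.94 ms scanout,
# leaving a VBI of roughly 9.72 ms.
print(round(vbi_ms(60, 144), 2))  # 9.72
```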

The scanout speed itself, both on a fixed refresh rate and variable refresh rate display, is dictated by the current maximum refresh rate of the display:

Blur Buster's G-SYNC 101: Scanout Speed Diagram

As the diagram shows, the higher the refresh rate of the display, the faster the scanout speed becomes. This also explains why V-SYNC OFF’s input lag advantage, especially at the same framerate as G-SYNC, is reduced as the refresh rate increases; single frame delivery becomes faster, and V-SYNC OFF has less of an opportunity to defeat the scanout.
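Put numerically (a quick sketch; real panels also spend a small slice of each refresh in the blanking interval, which is ignored here), one scanout takes roughly 1000 divided by the maximum refresh rate in milliseconds:

```python
# Scanout duration is set by the display's maximum refresh rate,
# regardless of the current framerate (VBI overhead ignored).
def scanout_ms(max_refresh_hz: float) -> float:
    return 1000.0 / max_refresh_hz

for hz in (60, 144, 240):
    print(f"{hz}Hz: ~{scanout_ms(hz):.2f} ms per scanout")
```

At 240Hz a full frame is scanned in roughly 4.17 ms versus 16.67 ms at 60Hz, which is why the gap between the sync methods narrows at high refresh rates.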

V-SYNC OFF can defeat the scanout by starting the scan of the next frame(s) within the previous frame’s scanout anywhere on screen, and at any given time:

Blur Buster's G-SYNC 101: Input Lag & Optimal Settings

This results in the simultaneous delivery of more than one frame scan in a single scanout (tearing), but also a reduction in input lag; the amount is dictated by the position and number of tearline(s), which in turn depends on the refresh rate/sustained framerate ratio (more on this later).
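As a back-of-the-envelope model of that ratio (an assumption for illustration only; actual tearline position also depends on frametime variance), roughly framerate divided by refresh rate frame slices begin within each scanout:

```python
# Rough model: with V-SYNC OFF, about fps / hz new frames start
# during a single scanout; each mid-scan start shows up as a tearline.
def frame_slices_per_scanout(fps: float, max_refresh_hz: float) -> float:
    return fps / max_refresh_hz

# At 300 FPS on a 144Hz display, roughly two frame slices (and thus
# about two tearlines) can land within one scanout.
print(round(frame_slices_per_scanout(300, 144), 2))  # 2.08
```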

As noted in G-SYNC 101: Range, G-SYNC + V-SYNC “Off” (a.k.a. Adaptive G-SYNC) can have a slight input lag reduction over G-SYNC + V-SYNC as well, since it will opt for tearing instead of aligning the next frame scan to the next scanout when sudden frametime variances occur.

To eliminate tearing, G-SYNC + V-SYNC is limited to completing a single frame scan per scanout, and it must follow the scanout from top to bottom, without exception. On paper, this can give the impression that G-SYNC + V-SYNC adds latency over the other two methods. However, with G-SYNC + V-SYNC, a single, complete frame is already delivered as fast as the scanout allows (the neutral speed); the advantage seen with V-SYNC OFF comes from delivering partial frames ahead of schedule, thanks to its ability to defeat the scanout.

Bottom line: within its range, G-SYNC + V-SYNC delivers single, tear-free frames to the display as fast as the scanout allows; any faster, and tearing would be introduced.
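Tying this back to the FPS limit established earlier in the series (the minus-3 margin is the guide's recommendation; `gsync_fps_cap` is just an illustrative helper name, not anything from a real API):

```python
# Keep the framerate a few FPS below the maximum refresh so G-SYNC
# stays within its range and never falls back to V-SYNC behavior.
def gsync_fps_cap(max_refresh_hz: int, margin: int = 3) -> int:
    return max_refresh_hz - margin

print(gsync_fps_cap(144))  # 141
print(gsync_fps_cap(240))  # 237
```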



3140 Comments For “G-SYNC 101”


taef

So I have a 540Hz monitor, but it runs at 500Hz because I’m using Windows 10. I’m playing Fortnite, and for a long time I struggled with dropped frames, even with low settings and low usage of both CPU and GPU. I tried your method (G-SYNC + V-SYNC with capped frames) and the game runs so much better, thanks to you. But I’m still experiencing some dropped frames, just not as many as before. What should I do at this point, and what should I cap my frames at?

voicon

If my screen is 165Hz and I play PUBG Mobile on Gameloop at only 120 FPS, how do I optimize the latency?
Currently I still have lag when shooting.

Skirtap

Some people recommend 141 FPS, some 142 FPS; which one should I choose? Have you noticed any screen tearing etc. with the 142 FPS option?

claulo

Hello,
I acquired a Samsung G6 OLED monitor (360Hz). My PC has the following components: 4080S TUF, 7800X3D, ASRock 650M PG WiFi, xflare DDR5 16×32 CL32 6000MHz, Thermalright 850W Platinum. I’ve noticed that in some games there are jumps between FPS values that are annoying to look at. For example, in A Plague Tale: Requiem, all ultra at 2K, I range between 95 and 130 FPS. It’s not stuttering, but it is annoying. I wanted to know what configuration to use, and whether it is necessary to cap the FPS in the NVIDIA panel.

Zehdah

Hey, so I have a 3080, playing on a 240Hz G-SYNC monitor. I followed this guide (G-SYNC on, V-SYNC on in the NVIDIA Control Panel, off in-game, FPS capped to 3 below the monitor’s refresh) and it has worked smoothly for every game. But what do you do to optimize unoptimized games, such as recent releases like The First Descendant? In these games my GPU runs at max (300-320W, 70 degrees, 99% load), while in other games I’ve never reached such power draw even at 200+ FPS. Wondering if I should cap at around 120, since the game reaches around 125 max.
