G-SYNC 101: G-SYNC vs. V-SYNC OFF


Beyond the Limits of the Scanout

It’s already been established that single, tear-free frame delivery is limited by the scanout, and V-SYNC OFF can defeat it by allowing more than one frame scan per scanout. That said, how much of an input lag advantage can be had over G-SYNC, and how high must the framerate be sustained above the refresh rate to diminish tearing artifacts and justify the difference?

[Charts: Blur Busters G-SYNC 101 — Input Latency & Optimal Settings]

Quite high. Counting first on-screen reactions, V-SYNC OFF already has a slight input lag advantage over G-SYNC at the same framerate (up to 1/2 frame, and more pronounced at lower refresh rates), but it takes a considerable increase in framerate above the given refresh rate to widen the gap to significant levels. And while the reductions may look significant in bar chart form, even with framerates in excess of 3x the refresh rate, and measured at middle screen (crosshair-level) only, V-SYNC OFF has a limited advantage over G-SYNC in practice. Most of that advantage falls in areas that, one could argue, are comparatively useless to the average player; a viewmodel's wrist updating 1-3ms faster with V-SYNC OFF is of little consequence.
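To put the "up to 1/2 frame" figure into milliseconds, here is a minimal sketch. It assumes, as described above, that the maximum V-SYNC OFF advantage scales with the scanout time (the inverse of the refresh rate); the exact measured values in the charts will differ per setup.

```python
# Sketch: the "up to 1/2 frame" V-SYNC OFF input lag advantage over
# G-SYNC, expressed in milliseconds at common refresh rates.
# Assumption: the maximum advantage is half of one scanout (1000/Hz ms).
for hz in (60, 120, 144, 240):
    scanout_ms = 1000 / hz             # time for one full frame scan
    max_advantage_ms = scanout_ms / 2  # up to half a frame
    print(f"{hz:>3} Hz: scanout {scanout_ms:.2f} ms, "
          f"max V-SYNC OFF advantage ~{max_advantage_ms:.2f} ms")
```

This also shows why the advantage shrinks at higher refresh rates: at 240Hz, half a frame is only about 2ms, versus over 8ms at 60Hz.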

This is where the refresh rate/sustained framerate ratio factors in:

[Charts: Blur Busters G-SYNC 101 — Input Latency & Optimal Settings]

As shown in the above diagrams, the true advantage comes when V-SYNC OFF can allow not just two, but multiple frame scans in a single scanout. Unlike syncing solutions, with V-SYNC OFF the frametime is not paced to the scanout, and a frame will begin scanning in as soon as it's rendered, regardless of whether the previous frame scan is still in progress. At 144Hz with 1000 FPS, for instance, this means that with a sustained frametime of 1ms, the display updates nearly 7 times in a single scanout.
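The arithmetic behind the "nearly 7 updates" figure can be sketched as follows. This assumes a perfectly uniform frametime of 1000/FPS ms and a scanout time of 1000/Hz ms; in practice frametimes fluctuate, so this is an idealized ratio, not a measurement.

```python
# Sketch: how many new frames V-SYNC OFF can splice into one scanout,
# assuming perfectly uniform frametimes (an idealization).
def frame_scans_per_scanout(hz: float, fps: float) -> float:
    scanout_ms = 1000 / hz      # time to scan one refresh cycle
    frametime_ms = 1000 / fps   # time to render one frame
    return scanout_ms / frametime_ms  # frames begun during one scanout

print(frame_scans_per_scanout(144, 1000))  # ~6.94, i.e. "nearly 7"
```

Each of those frame scans produces a tear line, but at these ratios each torn slice represents so little time that the tearing becomes hard to perceive.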

In fact, at 240Hz, first on-screen reactions became so fast at 1000 FPS that the inherent delay in my mouse and display became the bottleneck for minimum measurements.

So, for competitive players, V-SYNC OFF still reigns supreme in the input lag realm, especially if sustained framerates can exceed the refresh rate by 5x or more. However, while visible tearing artifacts are all but eliminated at these ratios on higher refresh rate displays, the tearing can instead manifest as microstutter, and thus, even at its best, V-SYNC OFF still can't match the consistency of G-SYNC frame delivery.



2774 Comments For “G-SYNC 101”

ULA

Hi there!

First of all, thanks for the guide! It helped a lot.

Now, to my question. I am looking to play Cyberpunk with the path-tracing settings, which is obviously a computational nightmare, even for my 4090. This would be the first time I enable DLSS 3.0, which I will refer to as frame generation (FG) from here on out. I will be using the performance mode of DLSS 2.0, since I game at 4K and want to minimize the input lag from FG; I believe the higher the native frame rate, the lower the input delay due to FG.

Now, from what I know, I should still enable V-SYNC from the NVIDIA Control Panel. However, it is not necessary to set the frame rate 3 FPS below the monitor's refresh rate, since that is automatically done by Reflex (which is enabled by default when using FG).

However, I would still like to cap my frame rate below my native 120 FPS, say at 100 FPS, so that the frame rate is consistent. But I noticed that the frame rate limiter (both in-game and in the NVIDIA Control Panel) has no effect. Is this behaviour expected?

Dogelol

I use G-SYNC on + V-SYNC on (in NVCP) with a frame rate cap of 160 (165Hz max) via RTSS; however, I experience varying degrees of flickering, mostly in menus, but in some games too.

From what I know, there is no way to fix the flickering, as it is normal for VA/OLED panels.

My question is this: since I cannot detect tearing (most likely there is some, but either the high FPS/OLED smoothness masks it or my eyes simply don't catch it), I am thinking about playing with G-SYNC off, no V-SYNC, and just an FPS cap. Would that impact my experience?

Ideally I would like to keep G-SYNC on due to the smoothness I feel with it; however, the flickering is a deal breaker for me.

What is the best combination if I do not wish to use G-SYNC: should I keep V-SYNC and the FPS cap, or only the FPS cap?

Thank you!

august

best settings for eafc 24?

rec0veryyy

Hi, I have been playing CS2 for several weeks. My monitor is 1440p 144Hz and G-SYNC compatible. In the NVCP I have G-SYNC on + V-SYNC on, and I also limit the FPS to 141 in the NVCP; in the game, V-SYNC is off and NVIDIA Reflex is off. I get 138 FPS as expected and everything works fine. But I have a question: should I play like this, or disable V-SYNC in the NVCP so CS2 can go to 200 or 300 FPS? As I understand it, more FPS means a lower frametime in ms; now I have about 7.2ms, but unlimited at 250 FPS I would have 4ms. Is this really so, and would it make any difference?

HarmVJ

So, Windows 10 has something called VRR in the graphics settings. Should it be used in tandem with G-SYNC + NVCP V-SYNC? Some sources and reviews just say to turn it on along with G-SYNC.
