G-SYNC 101: G-SYNC vs. V-SYNC OFF


Beyond the Limits of the Scanout

It’s already been established that single, tear-free frame delivery is limited by the scanout, and V-SYNC OFF can defeat it by allowing more than one frame scan per scanout. That said, how much of an input lag advantage can be had over G-SYNC, and how high must the framerate be sustained above the refresh rate to diminish tearing artifacts and justify the difference?

[Charts: Blur Buster's G-SYNC 101: Input Latency & Optimal Settings]

Quite high. Counting first on-screen reactions, V-SYNC OFF already has a slight input lag advantage over G-SYNC at the same framerate (up to a 1/2 frame, more pronounced the lower the refresh rate), but it takes a considerable increase in framerate above the given refresh rate to widen the gap to significant levels. And while the reductions may look significant in bar chart form, even with framerates in excess of 3x the refresh rate, and measured at middle screen (crosshair-level) only, V-SYNC OFF has a limited advantage over G-SYNC in practice. Most of that advantage lies in areas that, one could argue, are comparatively useless to the average player; little is gained when, say, a viewmodel’s wrist is updated 1-3ms faster with V-SYNC OFF.
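
To put that “up to a 1/2 frame” into numbers: the worst-case advantage scales with the scanout period, so it shrinks as the refresh rate climbs. A quick back-of-envelope sketch (illustrative arithmetic only; the refresh rates are example values, and measured results also include mouse, engine, and display delays):

```python
# Worst-case "1/2 frame" V-SYNC OFF advantage, in milliseconds,
# at a few example refresh rates.
for hz in (60, 144, 240):
    scanout_ms = 1000 / hz  # one full scanout period in ms
    print(f"{hz}Hz: scanout = {scanout_ms:.2f}ms, "
          f"up to ~{scanout_ms / 2:.2f}ms advantage")
```

At 60Hz that half-scanout is about 4.2ms, at 144Hz about 3.5ms, and at 240Hz only about 2.1ms, which is why the advantage is more pronounced at lower refresh rates.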

This is where the refresh rate/sustained framerate ratio factors in:

[Diagrams: Blur Buster's G-SYNC 101: Input Latency & Optimal Settings]

As shown in the above diagrams, the true advantage comes when V-SYNC OFF can allow not just two, but multiple frame scans in a single scanout. Unlike syncing solutions, with V-SYNC OFF, the frametime is not paced to the scanout, and a frame will begin scanning in as soon as it’s rendered, regardless of whether the previous frame scan is still in progress. At 144Hz with a sustained 1000 FPS, for instance, a 1ms frametime means nearly 7 new frame scans begin within a single scanout.
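
The arithmetic behind that example is simple; a minimal sketch of the frame-scans-per-scanout calculation (a simplified model that assumes perfectly even frametimes with no jitter):

```python
# Frame scans begun per scanout at a given refresh rate and sustained framerate.
def scans_per_scanout(refresh_hz: float, fps: float) -> float:
    scanout_ms = 1000 / refresh_hz  # time to scan one full frame, top to bottom
    frametime_ms = 1000 / fps       # time between rendered frames
    return scanout_ms / frametime_ms

print(scans_per_scanout(144, 1000))  # ~6.9, i.e. nearly 7 scans per scanout
```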

In fact, at 240Hz with 1000 FPS, first on-screen reactions became so fast that the inherent delay in my mouse and display became the bottleneck for minimum measurements.

So, for competitive players, V-SYNC OFF still reigns supreme in the input lag realm, especially if sustained framerates can exceed the refresh rate by 5x or more. However, while visible tearing artifacts are all but eliminated at these ratios on higher refresh rate displays, tearing can instead manifest as microstutter, and thus, even at its best, V-SYNC OFF still can’t match the consistency of G-SYNC frame delivery.



1365 Comments For “G-SYNC 101”


oliverds
Member

Hello Jorimt,

I am a bit confused about the combination of G-Sync and V-Sync.

I am running the following system:

Intel 8700K @ 5 GHz / Gigabyte Aorus Extreme 2080ti Waterforce / 32 GB RAM DDR4-3000 / LG 34UC89B with native G-Sync.

I usually do simracing, especially with Assetto Corsa, which is a very CPU-intensive simulation (it only uses one core), so the CPU can often reach 94% occupancy with 50-70 cars on the track.

To reduce the CPU occupancy, I use a frame limit of 70; in some situations the FPS can go down to 55, but this does not bother me.

In the NVCP, I set G-Sync on and use the max available refresh rate of the monitor. Before being confronted with several videos, I was using G-Sync together with Fast Sync activated in the NVCP, but now I think that this combination isn’t really the best.

What is the best combination for low input lag and acceptable CPU occupancy? G-Sync with V-Sync activated and my 70 FPS cap? This is really confusing.

With best regards and thanks in advance,

Oliver

TheBeker1
Member

I’m having trouble with microstutter, present in all games. I followed the guide, and even tried a clean install of drivers and Windows, but the stutter is still present, even in the NVIDIA Pendulum demo. The stutter happens inconsistently, and of course, there’s a framerate fluctuation that happens with it. I have a 1080 Ti and a 9600K, 16GB of RAM, and a 1TB SSD.

One of my friends has the same problem, and I’ve seen that it’s not uncommon on forums. Can I blame faulty hardware, or is there something more I can try?

gzmm
Member

Hey, I have a 144Hz G-Sync compatible monitor with a 48-144Hz range.
If I put the monitor at 120Hz, does G-Sync still work? Will this make a 60 FPS game look smooth?

andro92
Member

First of all, thank you for your amazing work.

Now, if I understand it all correctly, the optimum usage is this:

NVCP V-Sync on, G-Sync on, frame cap at -3/141 FPS (140 for me, thanks to my OCD 🙂). I don’t want to bother with an additional app like RTSS; the latest NVCP frame cap should be on par, right? So, a set-and-forget situation from the NVCP.

Now my confusion comes from ULLM. I have a good CPU: a 9900K overclocked to a fixed 4.9GHz at all times. I’m using the Windows high performance power plan as well. The GPU is a 2080S at 1440p, with 32GB of 3733MHz CL16 RAM, an M.2 SSD, and a clean Windows 10 Pro install without bloatware (again, thanks to my OCD). I also have a custom water loop, so temps are all OK.

If I set ULLM to on or ultra at all times (set & forget), will I see any negative effect?

Sample 1: game runs at 40-80 FPS, GPU-bound
Sample 2: game runs at the set 140 FPS cap, not GPU-bound
Sample 3: game runs at a 60 FPS internal limit (Mortal Kombat 11, DS3, Sekiro), not GPU-bound
Sample 4: no sync, fixed refresh rate, GPU-bound vs. not GPU-bound
Note: CPU usage is always low, mostly below 30%, but I understand system hiccups can happen even with the cleanest systems.

And what is the difference between on/ultra/in-game Reflex?

What is the effect of the NVCP power management mode? Should I leave it at default or choose maximum performance? It’s OK if it consumes more power.

I’m a fan of set-and-forget type usage, and I don’t care if ULLM ultra adds another FPS limit below my own FPS limit for some specific game. But I don’t want extra input lag or some bug.

My second question:

I was a competitive Overwatch player; now I play Mortal Kombat 11 competitively. Motion blur, backlight strobing, etc. are not as important in this game, but input lag is; even 1 frame matters when taking your turn. This game is designed around 60 FPS, and it does not strain my GPU, however it fluctuates between 58-60. What settings are best for this specific game? Can ULLM on or ultra add more input lag if I set & forget?

And the last topic (sorry for the wall of text):

Now, G-Sync monitors are 30-144Hz and G-Sync compatibles are 48-144Hz. My question is: how seamless is this “LFC” or Hz doubling/tripling? If my game runs between 40-60 FPS, is it better to just disable G-Sync for a more fluid experience? What happens at the exact moment the LFC kicks in? And what is the difference between adaptive sync and the G-Sync module on this topic?

blabliblu
Member

Thank you for this article.

The main reason I wanted to enable G-Sync (FreeSync, in my case) was that playing AC: Valhalla at 4K on my RTX 3080 with V-Sync on causes heavy stuttering when the game drops below 60 FPS, while without V-Sync there’s massive tearing going on (every 2-3 seconds; it’s horrible).

Now, I might be missing something here, and I understand what’s written in the article, but if I enable G-Sync, and enable V-Sync on top of it in the Nvidia Control Panel, I still suffer from the same exact stuttering when the game drops below 60 FPS. And indeed, disabling V-Sync stopped the game from stuttering and fixed the issue, with only occasional and very rare tearing compared to a plain no-V-Sync experience, which is a trade-off I can live with.

So, what gives? Should I just keep V-Sync off, then? Or maybe set my V-Sync to “Fast” in the Nvidia Control Panel?
