G-SYNC 101: G-SYNC vs. V-SYNC OFF w/FPS Limit


At the Mercy of the Scanout

Now that the FPS limit required for G-SYNC to avoid V-SYNC-level input lag has been established, how do G-SYNC + V-SYNC and G-SYNC + V-SYNC “Off” compare to V-SYNC OFF at the same framerate?

Blur Buster's G-SYNC 101: Input Latency & Optimal Settings

The results show a consistent difference between the three methods across most refresh rates (240Hz is nearly equalized in any scenario), with V-SYNC OFF (G-SYNC + V-SYNC “Off,” to a lesser degree) appearing to have a slight edge over G-SYNC + V-SYNC. Why? The answer is tearing…

With any vertical synchronization method, the delivery speed of a single, tear-free frame (barring unrelated frame delay caused by many other factors) is ultimately limited by the scanout. As mentioned in G-SYNC 101: Range, the “scanout” is the total time it takes a single frame to be physically drawn, pixel by pixel, left to right, top to bottom, on-screen.

With a fixed refresh rate display, both the refresh rate and scanout remain fixed at their maximum, regardless of framerate. With G-SYNC, the refresh rate is matched to the framerate, and while the scanout speed remains fixed, the refresh rate controls how many times the scanout is repeated per second (60 times at 60 FPS/60Hz, 45 times at 45 FPS/45Hz, etc.), along with the duration of the vertical blanking interval (the span between the previous and next frame scan), within which G-SYNC calculates and performs all overdrive and synchronization adjustments from frame to frame.
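This relationship can be sketched numerically. The following is a rough model with hypothetical function names, not NVIDIA's actual implementation: the scanout duration stays fixed at the display's maximum refresh rate, and the vertical blanking interval stretches to fill the remainder of the frame period at the current framerate.

```python
# Rough model (hypothetical names): under G-SYNC, scanout time is fixed by the
# display's maximum refresh rate, while the vertical blanking interval (VBI)
# stretches so the total frame period matches the current framerate.

def gsync_timing(max_refresh_hz: float, current_fps: float):
    """Return (scanout_ms, vbi_ms) for a framerate within the G-SYNC range."""
    scanout_ms = 1000.0 / max_refresh_hz      # fixed, e.g. ~6.9 ms at 144Hz
    frame_period_ms = 1000.0 / current_fps    # total time between frame starts
    vbi_ms = frame_period_ms - scanout_ms     # blanking absorbs the remainder
    return scanout_ms, vbi_ms

# At 144Hz max refresh, a 60 FPS frame spends ~6.9 ms scanning out and
# roughly 9.7 ms in the extended blanking interval.
scanout, vbi = gsync_timing(144, 60)
print(f"scanout: {scanout:.2f} ms, VBI: {vbi:.2f} ms")
```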

The scanout speed itself, both on a fixed refresh rate and variable refresh rate display, is dictated by the current maximum refresh rate of the display:

Blur Buster's G-SYNC 101: Scanout Speed Diagram

As the diagram shows, the higher the refresh rate of the display, the faster the scanout speed becomes. This also explains why V-SYNC OFF’s input lag advantage, especially at the same framerate as G-SYNC, is reduced as the refresh rate increases; single frame delivery becomes faster, and V-SYNC OFF has less of an opportunity to defeat the scanout.
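Concretely, a single full-frame scan takes the inverse of the maximum refresh rate. A minimal sketch (hypothetical helper name):

```python
# Sketch: single-frame scanout time at common maximum refresh rates.
# Higher refresh rate -> faster scanout -> smaller window for V-SYNC OFF
# to "defeat" the scanout with a mid-scan frame splice.

def scanout_ms(refresh_hz: float) -> float:
    """Time for one full top-to-bottom frame scan, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144, 240):
    print(f"{hz:>3}Hz: {scanout_ms(hz):.2f} ms per full frame scan")
```

At 240Hz the whole scan completes in roughly a quarter of the time it takes at 60Hz, which is why the latency gap between methods nearly closes at high refresh rates.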

V-SYNC OFF can defeat the scanout by starting the scan of the next frame(s) within the previous frame’s scanout anywhere on screen, and at any given time:

Blur Buster's G-SYNC 101: Input Lag & Optimal Settings

This results in simultaneous delivery of more than one frame scan in a single scanout (tearing), but also a reduction in input lag; the amount of which is dictated by the positioning and number of tearline(s), which is further dictated by the refresh rate/sustained framerate ratio (more on this later).
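As a back-of-the-envelope sketch of that refresh rate/framerate ratio (hypothetical helper name; real tearline count and placement also depend on frametime variance):

```python
# Rough sketch of V-SYNC OFF behavior: each new frame that begins during an
# in-progress scanout is spliced in at the current scan position, producing
# a tearline. The average number of splices per scanout is therefore
# approximately framerate / refresh rate (a simplification; actual tearline
# positions drift with frametime variance).

def approx_frame_slices_per_scan(framerate: float, refresh_hz: float) -> float:
    """Average number of distinct frame slices shown within one scanout."""
    return framerate / refresh_hz

# e.g. 300 FPS on a 60Hz display: about 5 frame slices per scanout,
# meaning roughly 4 visible tearlines per refresh on average.
print(approx_frame_slices_per_scan(300, 60))
```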

As noted in G-SYNC 101: Range, G-SYNC + V-SYNC “Off” (a.k.a. Adaptive G-SYNC) can have a slight input lag reduction over G-SYNC + V-SYNC as well, since it will opt for tearing instead of aligning the next frame scan to the next scanout when sudden frametime variances occur.

To eliminate tearing, G-SYNC + V-SYNC is limited to completing a single frame scan per scanout, and it must follow the scanout from top to bottom, without exception. On paper, this can give the impression that G-SYNC + V-SYNC has an increase in latency over the other two methods. However, the delivery of a single, complete frame with G-SYNC + V-SYNC is actually the lowest possible, or “neutral,” speed, and the advantage seen with V-SYNC OFF is a further reduction in delivery time below that baseline, due to its ability to defeat the scanout.
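To illustrate the baseline-versus-reduction point, here is a minimal sketch (function names hypothetical, and the tearline position is an arbitrary assumption): G-SYNC + V-SYNC always needs one full scanout to deliver a complete frame, while under V-SYNC OFF the screen area below a tearline is drawn from a newer frame, arriving ahead of that neutral baseline.

```python
# Minimal model (hypothetical names): tear-free delivery vs. the "defeated"
# scanout under V-SYNC OFF.

def full_scanout_ms(refresh_hz: float) -> float:
    """G-SYNC + V-SYNC: one complete, tear-free frame takes one full scanout."""
    return 1000.0 / refresh_hz

def newer_frame_lead_ms(refresh_hz: float, tearline_fraction: float) -> float:
    """V-SYNC OFF: the area below a tearline (placed the given fraction of the
    way down the screen) shows a newer frame, gaining up to this much time
    over the neutral full-scanout delivery."""
    return full_scanout_ms(refresh_hz) * (1.0 - tearline_fraction)

# At 60Hz, a tearline halfway down lets the newer frame's lower half arrive
# ~8.3 ms ahead of the baseline; at 240Hz the same splice gains only ~2.1 ms,
# which is why V-SYNC OFF's advantage shrinks at higher refresh rates.
print(newer_frame_lead_ms(60, 0.5), newer_frame_lead_ms(240, 0.5))
```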

Bottom line: within its range, G-SYNC + V-SYNC delivers single, tear-free frames to the display as fast as the scanout allows; any faster, and tearing would be introduced.



1096 Comments For “G-SYNC 101”

God Hands
Member

Hi, jorimt! Thank you for creating this guide and still being active in its comments for all this time. I’m a fairly new PC owner and I have a couple of questions surrounding RTSS and a program called ISLC.

First question: I’m currently using my NVCP resolution that’s set to Ultra HD 4K native (NOT PC resolution 4K) because of the LG CX TV and its HDMI 2.0 (UHD 48g).

The reason is because for some reason, the Ultra HD selection in NVCP allows RGB Full and doesn’t give me the ONLY option of YCBCR, which also doesn’t let me select Full dynamic range. I’m guessing because I’m choosing the “TV” resolution.

This is relevant because despite the UHD+ resolution looking better, more colorful and more crisp at the same resolution as the PC selection, the highest refresh rate available is CAPPED at 60hz instead of the full 120hz I know the LG CX is capable of and can be selected in any PC resolution in the NVCP. (Side Note: what’s even stranger is that certain games seem to take advantage of the 120hz anyway despite my NVCP 60hz being selected from UHD resolution settings, meanwhile others are capped at 60.)

So my actual first question is this: since my NVCP resolution is selected as UHD AND 60hz despite being a 120hz monitor, do I set the RTSS to -3 under THAT refresh rate (which would become 57hz) or still set RTSS to 117hz for -3 because I’m using a 120hz 4ktv/monitor, despite the UHD resolution selection being 60hz in panel?

Secondly, what are your opinions on using something like ISLC or Timer Resolution(?) for decreasing input lag (and in ISLC’s case, input lag and memory) in conjunction with your G-Sync/V-Sync/RTSS instructions here? Necessary? Overkill? Incompatible? I don’t want too many cooks in the kitchen if it’s a problem. It claims to halve my 1ms response time to .5ms, but I’m not sure if there are any hidden problems this could cause with your method that I should know about. Thank you! I admire your passion for optimal settings.

This is ISLC for reference. https://www.wagnardsoft.com/forums/viewtopic.php?f=18&t=1256&sid=ac940d134fc3d4ffe921578dc23dfb36

m_staf
Member

Hello,

I have been having much trouble with my G-sync intermittently working for seemingly no rhyme or reason for months. I have followed all of the recommended steps for setting up Gsync and have a Gsync certified monitor (Asus TUF VG27A). Gsync WILL work for a few of the games on my computer such as Red Dead Redemption 2 and Rise of the Tomb Raider. I can tell that it is working because of the Gsync enabled badge appearing in the game and a noticeable improvement in tearing. Gsync will not however work in games like Battlefield V or World of Warcraft classic (it previously did when I first built my computer). I figured this may be due to improper in game settings on my end at first. However, when using the Nvidia pendulum demo, I have not once been able to select the Gsync box in the demo. There are no signs of Gsync working at all in the demo and it is accompanied by terrible frame rate tearing. I can select the Vsync option in the demo (if that means anything).
I have done a clean reinstall of the drivers, tried multiple monitors, display port cables, HDMI cables, etc, default Nvidia control panel settings. Nothing has seemed to work at all! It is incredibly frustrating as I have combed forums and boards for months looking for a solution. It is also upsetting spending the extra money for a certified Gsync monitor and still having the issues. I usually have a dual monitor setup. One monitor is a freesync Acer XF270HUA and the other is the Asus previously mentioned. I have tried testing one monitor at a time with the same issues so I do not believe it to be the fault of having dual monitors.

All settings on nvidia control panel are default besides the following

Set up G-SYNC > Enable G-SYNC, G-SYNC Compatible > Enable for full screen mode.
Manage 3D settings > Max Frame Rate 162 FPS
Manage 3D settings > Vertical sync >ON

Do you have any ideas of what could be causing the issue or inconsistency?

Thank you for your time and help!

drameloide
Member

The results show a consistent difference between the three methods across most refresh rates (240Hz is nearly equalized in any scenario), with V-SYNC OFF (G-SYNC + V-SYNC “Off,” to a lesser degree) appearing to have a slight edge over G-SYNC + V-SYNC. Why? The answer is tearing…

About this paragraph: you said G-SYNC + V-SYNC off has tearing issues, and this is completely wrong if you are within the G-SYNC range; the only difference between V-SYNC on + G-SYNC and V-SYNC off + G-SYNC is the extra input lag V-SYNC gives you.

tbhee
Member

For AMD is it correct to set the framerate limiter through Radeon Chill Min and Max = cap?

And for a game that does have an ingame limiter would it be overkill to be using that as well
as Chill?

Gsyncmasterace
Member

Hey there 😀

Do you guys know if there is any difference between G-SYNC and a fixed refresh rate with an unlimited framerate (let’s say 400 FPS at 240Hz)?
