G-SYNC 101: Range

Blur Buster's G-SYNC 101: Range Chart

Exceeds G-SYNC Range

G-SYNC + V-SYNC “Off”:
G-SYNC disengages and tearing begins display-wide; no frame delay is added.

G-SYNC + V-SYNC “On”:
G-SYNC reverts to V-SYNC behavior when it can no longer adjust the refresh rate to the framerate; 2-6 frames (typically 2 frames; approximately an additional 33.3ms @60 Hz, 20ms @100 Hz, 13.9ms @144 Hz, etc.) of delay is added as rendered frames begin to over-queue in both buffers, ultimately delaying their appearance on-screen.
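The added delay above follows directly from the refresh rate. A minimal sketch, assuming the simple model that each over-queued frame adds one full refresh period of latency:

```python
# Sketch: extra latency from V-SYNC buffer over-queuing above the G-SYNC range.
# Assumes each queued frame adds one full refresh period (a simplification).

def added_latency_ms(queued_frames: int, refresh_hz: float) -> float:
    """Extra delay in milliseconds from `queued_frames` waiting in the buffers."""
    frame_period_ms = 1000.0 / refresh_hz  # duration of one refresh cycle
    return queued_frames * frame_period_ms

# The typical case from the text: 2 queued frames.
print(round(added_latency_ms(2, 60), 1))   # 33.3
print(round(added_latency_ms(2, 100), 1))  # 20.0
print(round(added_latency_ms(2, 144), 1))  # 13.9
```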

G-SYNC + Fast Sync*:
G-SYNC disengages, Fast Sync engages, 0-1 frame of delay is added**.
*Fast Sync is best used with framerates in excess of 2x to 3x the display’s maximum refresh rate, as its third buffer selects the “best” frame to display as the final render; the higher the sample rate, the better it functions. Do note, even at its most optimal, Fast Sync introduces uneven frame pacing, which can manifest as recurring microstutter.
**Refresh rate/framerate ratio dependent (see G-SYNC 101: G-SYNC vs. Fast Sync).
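The 2x-3x rule of thumb above can be expressed as a simple ratio check. A sketch, where the 2.0 default threshold is an assumption taken from that guidance, not a hard cutoff:

```python
# Sketch: heuristic for when Fast Sync is worthwhile, per the rule of thumb
# that the framerate should exceed 2x-3x the display's maximum refresh rate.
# The min_ratio default of 2.0 is an illustrative assumption.

def fast_sync_effective(framerate: float, max_refresh_hz: float,
                        min_ratio: float = 2.0) -> bool:
    """True if the framerate is high enough for Fast Sync to sample well."""
    return framerate >= min_ratio * max_refresh_hz

print(fast_sync_effective(300, 144))  # True  (ratio ~2.1)
print(fast_sync_effective(200, 144))  # False (ratio ~1.4)
```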

Within G-SYNC Range

Refer to “Upper & Lower Frametime Variances” section below…

Upper & Lower Frametime Variances

G-SYNC + V-SYNC “Off”:
The tearing inside the G-SYNC range with V-SYNC “Off” is caused by sudden frametime variances output by the system, which will vary in severity and frequency depending on both the efficiency of the given game engine, and the system’s ability (or inability) to deliver consistent frametimes.

G-SYNC + V-SYNC “Off” disables the G-SYNC module’s ability to compensate for sudden frametime variances, meaning, instead of aligning the next frame scan to the next scanout (the process that physically draws each frame, pixel by pixel, left to right, top to bottom on-screen), G-SYNC + V-SYNC “Off” will opt to start the next frame scan in the current scanout instead. This results in simultaneous delivery of more than one frame in a single scanout (tearing).

In the Upper FPS range, tearing will be limited to the bottom of the display. In the Lower FPS range (<36) where frametime spikes can occur (see What are Frametime Spikes?), full tearing will begin.
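Where the tear line lands can be modeled simply: if a new frame is flipped partway through a scanout, the tear appears at the row being drawn at that moment. A simplified sketch that ignores the vertical blanking interval and assumes a constant top-to-bottom draw rate:

```python
# Sketch: a simplified model of tear-line position when a frame is flipped
# mid-scanout. Assumes constant top-to-bottom scanout and ignores the VBI.

def tear_line_row(flip_time_ms: float, scanout_ms: float, rows: int) -> int:
    """Row at which the tear appears for a flip `flip_time_ms` into the scanout."""
    fraction = (flip_time_ms % scanout_ms) / scanout_ms
    return int(fraction * rows)

# A flip halfway through a 6.9 ms (144 Hz) scanout on a 1440-row panel:
print(tear_line_row(3.45, 6.9, 1440))  # 720
```

This also illustrates why tearing stays near the bottom of the display in the upper FPS range: with only slight frametime variance, flips land late in the scanout.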

Without frametime compensation, G-SYNC functionality with V-SYNC “Off” is effectively “Adaptive G-SYNC,” and should be avoided for a tear-free experience (see G-SYNC 101: Optimal Settings & Conclusion).

G-SYNC + V-SYNC “On”:
This is how G-SYNC was originally intended to function. Unlike G-SYNC + V-SYNC “Off,” G-SYNC + V-SYNC “On” allows the G-SYNC module to compensate for sudden frametime variances by adhering to the scanout, which ensures the affected frame scan will complete in the current scanout before the next frame scan and scanout begin. This eliminates tearing within the G-SYNC range, in spite of the frametime variances encountered.

Frametime compensation with V-SYNC “On” is performed during the vertical blanking interval (the span between the previous and next frame scan), and, as such, does not delay single frame delivery within the G-SYNC range and is recommended for a tear-free experience (see G-SYNC 101: Optimal Settings & Conclusion).

G-SYNC + Fast Sync:
Upper FPS range: Fast Sync may engage, 1/2 to 1 frame of delay is added.
Lower FPS range: see “V-SYNC ‘On'” above.

What are Frametime Spikes?

Frametime spikes are an abrupt interruption of frames output by the system. On a capable setup running an efficient game engine, they typically occur due to loading screens, background asset streaming, network activity, and/or the triggering of a script or physics system, but they can also be exacerbated by an incapable setup, an inefficient game engine, poor netcode, low RAM/VRAM and page file overuse, misconfigured (or poorly supported) SLI setups, faulty drivers, specific or excess background processes, in-game overlay or input device conflicts, or a combination of them all.

Not to be confused with other performance issues, like framerate slowdown or V-SYNC-induced stutter, frametime spikes manifest as the occasional hitch or pause, and usually last for mere micro- to milliseconds at a time (seconds, in the worst of cases), plummeting the framerate to as low as the single digits and concurrently raising the frametime to upwards of 1000ms before re-normalizing.
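In a per-frame frametime capture (such as one exported from a benchmarking tool), spikes like these stand out as isolated outliers. A minimal sketch; the 100 ms threshold is an illustrative assumption, not a value from the text:

```python
# Sketch: flagging frametime spikes in a list of per-frame times (ms).
# The 100 ms threshold is an illustrative assumption.

def find_spikes(frametimes_ms, threshold_ms=100.0):
    """Return indices of frames whose frametime exceeds the spike threshold."""
    return [i for i, ft in enumerate(frametimes_ms) if ft > threshold_ms]

# A steady ~16.7 ms (60 FPS) trace with one 1000 ms hitch, as described above:
trace = [16.7] * 5 + [1000.0] + [16.7] * 5
print(find_spikes(trace))  # [5]
```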

G-SYNC eliminates traditional V-SYNC stutter caused below the maximum refresh rate by repeated frames from delayed frame delivery, but frametime spikes still affect G-SYNC, since it can only mirror what the system is outputting. As such, when G-SYNC has nothing new to sync to for a frame or frames at a time, it must repeat the previous frame(s) until the system resumes new frame(s) output, which results in the visible interruption observed as stutter.

The more efficient the game engine, and the more capable the system running it, the fewer frametime spikes there are (and the shorter they last), but no setup can fully avoid their occurrence.

Minimum Refresh Range

Once the framerate falls to approximately 36 FPS and below, the G-SYNC module begins inserting duplicate refreshes per frame to maintain the panel’s minimum physical refresh rate, keep the display active, and smooth motion perception. If the framerate is 36 FPS, the refresh rate will double to 72 Hz; at 18 FPS, it will triple to 54 Hz, and so on. This behavior continues down to 1 frame per second.
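The multiplier behavior can be inferred from the two examples given: the module picks the smallest whole multiple of the framerate that lands above the ~36 Hz threshold. A sketch of that inferred rule; the actual module logic is NVIDIA's and may differ:

```python
# Sketch of the duplicate-refresh behavior described above, inferred from the
# 36 FPS -> 72 Hz and 18 FPS -> 54 Hz examples. The real module logic is
# NVIDIA's and may differ in detail.

def effective_refresh_hz(framerate: float, threshold_hz: float = 36.0) -> float:
    """Smallest integer multiple of the framerate strictly above the threshold."""
    multiplier = 1
    while framerate * multiplier <= threshold_hz:
        multiplier += 1
    return framerate * multiplier

print(effective_refresh_hz(36))  # 72 (doubled)
print(effective_refresh_hz(18))  # 54 (tripled)
print(effective_refresh_hz(60))  # 60 (within range, no duplication)
```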

Regardless of the reported framerate and variable refresh rate of the display, the scanout speed will always match the display’s current maximum refresh rate: 16.6ms @60 Hz, 10ms @100 Hz, 6.9ms @144 Hz, and so on. G-SYNC’s ability to detach framerate and refresh rate from the scanout speed can have benefits such as faster frame delivery and reduced input lag on high refresh rate displays at lower fixed framerates (see G-SYNC 101: Hidden Benefits of High Refresh Rate G-SYNC).
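The scanout durations quoted above are just the reciprocal of the maximum refresh rate. A one-line sketch (values here are rounded, so 60 Hz shows as 16.7 ms rather than the truncated 16.6 ms in the text):

```python
# Sketch: scanout duration is fixed by the display's maximum refresh rate,
# regardless of the variable refresh rate currently in effect.

def scanout_ms(max_refresh_hz: float) -> float:
    """Time to draw one full frame, top to bottom, at the panel's max refresh."""
    return 1000.0 / max_refresh_hz

print(round(scanout_ms(60), 1))   # 16.7
print(round(scanout_ms(100), 1))  # 10.0
print(round(scanout_ms(144), 1))  # 6.9
```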

1096 Comments For “G-SYNC 101”


God Hands

Hi, jorimt! Thank you for creating this guide and still being active in its comments for all this time. I’m a fairly new PC owner and I have a couple of questions surrounding RTSS and a program called ISLC.

First question: I’m currently using my NVCP resolution that’s set to Ultra HD 4k native (NOT PC resolution 4k) because of the LG CX TV and its HDMI 2.0 (UHD 48g).

The reason is because for some reason, the Ultra HD selection in NVCP allows RGB Full and doesn’t give me the ONLY option of YCBCR, which also doesn’t let me select Full dynamic range. I’m guessing because I’m choosing the “TV” resolution.

This is relevant because despite the UHD+ resolution looking better, more colorful and more crisp at the same resolution as the PC selection, the highest refresh rate available is CAPPED at 60hz instead of the full 120hz I know the LG CX is capable of and can be selected in any PC resolution in the NVCP. (Side Note: what’s even stranger is that certain games seem to take advantage of the 120hz anyway despite my NVCP 60hz being selected from UHD resolution settings, meanwhile others are capped at 60.)

So my actual first question is this: since my NVCP resolution is selected as UHD AND 60hz despite being a 120hz monitor, do I set the RTSS to -3 under THAT refresh rate (which would become 57hz) or still set RTSS to 117hz for -3 because I’m using a 120hz 4ktv/monitor, despite the UHD resolution selection being 60hz in panel?

Secondly, what are your opinions on using something like ISLC or Timer Resolution(?) for decreasing input lag (and in ISLC’s case, input lag and memory) in conjunction with your G-Sync/V-Sync/RTSS instructions here? Necessary? Overkill? Incompatible? I don’t want too many cooks in the kitchen if it’s a problem. It claims to halve my 1ms response time to .5ms, but I’m not sure if there’s any hidden problems this could cause with your method that I should know about. Thank you! I admire your passion for optimal settings.

This is ISLC for reference. https://www.wagnardsoft.com/forums/viewtopic.php?f=18&t=1256&sid=ac940d134fc3d4ffe921578dc23dfb36



I have been having much trouble with my G-sync intermittently working for seemingly no rhyme or reason for months. I have followed all of the recommended steps for setting up Gsync and have a Gsync certified monitor (Asus TUF VG27A). Gsync WILL work for a few of the games on my computer such as Red Dead Redemption 2 and Rise of the Tomb Raider. I can tell that it is working because of the Gsync enabled badge appearing in the game and a noticeable improvement in tearing. Gsync will not however work in games like Battlefield V or World of Warcraft classic (it previously did when I first built my computer). I figured this may be due to improper in game settings on my end at first. However, when using the Nvidia pendulum demo, I have not once been able to select the Gsync box in the demo. There are no signs of Gsync working at all in the demo and it is accompanied by terrible frame tearing. I can select the Vsync option in the demo (if that means anything).
I have done a clean reinstall of the drivers, tried multiple monitors, display port cables, HDMI cables, etc, default Nvidia control panel settings. Nothing has seemed to work at all! It is incredibly frustrating as I have combed forums and boards for months looking for a solution. It is also upsetting spending the extra money for a certified Gsync monitor and still having the issues. I usually have a dual monitor setup. One monitor is a freesync Acer XF270HUA and the other is the Asus previously mentioned. I have tried testing one monitor at a time with the same issues so I do not believe it to be the fault of having dual monitors.

All settings on nvidia control panel are default besides the following

Set up G-SYNC > Enable G-SYNC, G-SYNC Compatible > Enable for full screen mode.
Manage 3D settings > Max Frame Rate 162 FPS
Manage 3D settings > Vertical sync >ON

Do you have any ideas of what could be causing the issue or inconsistency?

Thank you for your time and help!


The results show a consistent difference between the three methods across most refresh rates (240Hz is nearly equalized in any scenario), with V-SYNC OFF (G-SYNC + V-SYNC “Off,” to a lesser degree) appearing to have a slight edge over G-SYNC + V-SYNC. Why? The answer is tearing…

About this paragraph: you said G-SYNC + V-SYNC off has tearing issues, and this is completely wrong if you are within the G-SYNC range; the only difference between V-SYNC on + G-SYNC and V-SYNC off + G-SYNC is the added input lag V-SYNC gives you.


For AMD is it correct to set the framerate limiter through Radeon Chill Min and Max = cap?

And for a game that does have an ingame limiter, would it be overkill to be using that as well as Chill?


Hey there 😀

Do you guys know if there is any difference between gsync and a fixed refresh rate with an unlimited framerate (let’s say 400 fps at 240hz)?