G-SYNC 101: Range


Blur Busters' G-SYNC 101: Range Chart

Exceeds G-SYNC Range

G-SYNC + V-SYNC “Off”:
G-SYNC disengages, tearing begins display-wide, and no frame delay is added.

G-SYNC + V-SYNC “On”:
G-SYNC reverts to V-SYNC behavior when it can no longer adjust the refresh rate to the framerate; 2-6 frames of delay (typically 2 frames; approximately an additional 33.2ms @60 Hz, 20ms @100 Hz, 13.8ms @144 Hz, etc.) are added as rendered frames begin to over-queue in both buffers, ultimately delaying their appearance on-screen.
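
For a rough sense of where those delay figures come from: each over-queued frame waits on the order of one refresh cycle (1000 ÷ refresh rate, in ms) before it can be displayed. Below is a back-of-the-envelope sketch in Python; it is illustrative only, not a model of the driver's actual queuing behavior:

# Illustrative only: estimate the added delay when V-SYNC back-pressure
# queues frames above the G-SYNC range (each queued frame waits roughly
# one refresh cycle before it reaches the screen).
def vsync_queue_delay_ms(refresh_hz: float, queued_frames: int = 2) -> float:
    return queued_frames * (1000.0 / refresh_hz)

for hz in (60, 100, 144):
    print(f"{hz} Hz: ~{vsync_queue_delay_ms(hz):.1f} ms with 2 queued frames")
# 60 Hz: ~33.3 ms, 100 Hz: ~20.0 ms, 144 Hz: ~13.9 ms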

G-SYNC + Fast Sync*:
G-SYNC disengages, Fast Sync engages, 0-1 frame of delay is added**.
*Fast Sync is best used with framerates in excess of 2x to 3x that of the display’s maximum refresh rate, as its third buffer selects the “best” (most recently completed) frame to display as the final render; the higher the sample rate, the better it functions (a simplified sketch of this selection follows below). Do note, even at its most optimal, Fast Sync introduces uneven frame pacing, which can manifest as recurring microstutter.
**Refresh rate/framerate ratio dependent (see G-SYNC 101: G-SYNC vs. Fast Sync).
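
For illustration only, the third-buffer selection can be pictured as a “latest completed frame wins” rule. The sketch below is a conceptual simplification, not NVIDIA's actual implementation:

# Conceptual sketch only (not NVIDIA's implementation): Fast Sync lets the
# game render unthrottled, and each scanout presents the newest completed
# frame while discarding the rest.
completed_frames = []  # frames finished since the last scanout

def on_frame_complete(frame):
    completed_frames.append(frame)

def on_scanout(previous_frame):
    if completed_frames:
        newest = completed_frames[-1]   # present only the newest frame
        completed_frames.clear()        # older frames are dropped, never shown
        return newest
    return previous_frame               # nothing new finished: repeat the last frame

on_frame_complete("frame 1")
on_frame_complete("frame 2")
print(on_scanout(None))                 # "frame 2" ("frame 1" is discarded)

The more frames that complete between scanouts, the closer the presented frame is to the moment of display (hence the 2x to 3x framerate recommendation), but because the selected frames were rendered at irregular points in time, their on-screen pacing is uneven.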

Within G-SYNC Range

Refer to the “Upper & Lower Frametime Variances” section below…

Upper & Lower Frametime Variances

G-SYNC + V-SYNC “Off”:
The tearing inside the G-SYNC range with V-SYNC “Off” is caused by sudden frametime variances output by the system, which will vary in severity and frequency depending on both the efficiency of the given game engine, and the system’s ability (or inability) to deliver consistent frametimes.

G-SYNC + V-SYNC “Off” disables the G-SYNC module’s ability to compensate for sudden frametime variances, meaning that instead of aligning the next frame scan to the next scanout (the process that physically draws each frame, pixel by pixel, left to right, top to bottom on-screen), G-SYNC + V-SYNC “Off” will opt to start the next frame scan in the current scanout. This results in simultaneous delivery of more than one frame in a single scanout (tearing).

In the Upper FPS range, tearing will be limited to the bottom of the display. In the Lower FPS range (<36) where frametime spikes can occur (see What are Frametime Spikes?), full tearing will begin.

Without frametime compensation, G-SYNC functionality with V-SYNC “Off” is effectively “Adaptive G-SYNC,” and should be avoided for a tear-free experience (see G-SYNC 101: Optimal Settings & Conclusion).

G-SYNC + V-SYNC “On”:
This is how G-SYNC was originally intended to function. Unlike G-SYNC + V-SYNC “Off,” G-SYNC + V-SYNC “On” allows the G-SYNC module to compensate for sudden frametime variances by adhering to the scanout, which ensures the affected frame scan will complete in the current scanout before the next frame scan and scanout begin. This eliminates tearing within the G-SYNC range, in spite of the frametime variances encountered.

Frametime compensation with V-SYNC “On” is performed during the vertical blanking interval (the span between the previous and next frame scan), and, as such, does not delay single frame delivery within the G-SYNC range and is recommended for a tear-free experience (see G-SYNC 101: Optimal Settings & Conclusion).

G-SYNC + Fast Sync:
Upper FPS range: Fast Sync may engage, 1/2 to 1 frame of delay is added.
Lower FPS range: see “G-SYNC + V-SYNC ‘On’” above.

What are Frametime Spikes?

Frametime spikes are an abrupt interruption of frames output by the system. On a capable setup running an efficient game engine, they typically occur due to loading screens, shader compilation, background asset streaming, auto saves, network activity, and/or the triggering of a script or physics system, but they can also be exacerbated by an incapable setup, an inefficient game engine, poor netcode, low RAM/VRAM and page file overuse, misconfigured (or limited game support for) SLI setups, faulty drivers, specific or excess background processes, in-game overlay or input device conflicts, or a combination of them all.

Not to be confused with other performance issues, like framerate slowdown or V-SYNC-induced stutter, frametime spikes manifest as the occasional hitch or pause, and usually last for mere micro to milliseconds at a time (seconds, in the worst of cases), plummeting the framerate to as low as the single digits, and concurrently raising the frametime to upwards of 1000ms before re-normalizing.
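
To make the numbers concrete: framerate and frametime are reciprocals (60 FPS ≈ 16.7ms per frame), so a spike to ~1000ms is momentarily equivalent to about 1 FPS. The hypothetical sketch below flags such spikes in a captured frametime log; the 4x-running-average threshold is an arbitrary choice for illustration:

# Illustrative sketch only (threshold is arbitrary): flag frametime spikes
# in a captured frametime log by comparing each frame against the running
# average of the preceding frames.
def find_spikes(frametimes_ms, multiplier=4.0):
    spikes = []
    for i, ft in enumerate(frametimes_ms):
        window = frametimes_ms[max(0, i - 30):i] or [ft]
        avg = sum(window) / len(window)
        if ft > avg * multiplier:
            spikes.append((i, ft))
    return spikes

# A 16.7 ms cadence (60 FPS) interrupted by a single 250 ms hitch:
log = [16.7] * 60 + [250.0] + [16.7] * 60
print(find_spikes(log))   # [(60, 250.0)]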

G-SYNC eliminates the traditional V-SYNC stutter caused below the maximum refresh rate by repeated frames resulting from delayed frame delivery, but frametime spikes still affect G-SYNC, since it can only mirror what the system is outputting. As such, when G-SYNC has nothing new to sync to for a frame or frames at a time, it must repeat the previous frame(s) until the system resumes outputting new frames, which results in the visible interruption observed as stutter.

The more efficient the game engine, and the more capable the system running it, the fewer frametime spikes there are (and the shorter they last), but no setup can fully avoid their occurrence.

Minimum Refresh Range

Once the framerate reaches approximately 36 FPS and below, the G-SYNC module begins inserting duplicate refreshes per frame to maintain the panel’s minimum physical refresh rate, keep the display active, and smooth motion perception. If the framerate is at 36 FPS, the refresh rate will double to 72 Hz; at 18 FPS, it will triple to 54 Hz; and so on. This behavior continues down to 1 frame per second.
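
In effect, each frame is repeated however many times are needed to lift the physical refresh rate back above the ~36 Hz mark. The following sketch assumes that threshold as described above; the module's actual logic is internal to the hardware:

# Sketch only (assumes the ~36 FPS threshold described above; the module's
# real logic is proprietary): repeat each frame until the resulting physical
# refresh rate sits back above the threshold.
def effective_refresh_hz(framerate: float, threshold_hz: float = 36.0) -> float:
    if framerate > threshold_hz:
        return framerate          # within range: one refresh per frame
    repeats = 1
    while framerate * repeats <= threshold_hz:
        repeats += 1              # insert duplicate refreshes
    return framerate * repeats

print(effective_refresh_hz(36))   # 72.0 (2 refreshes per frame)
print(effective_refresh_hz(18))   # 54.0 (3 refreshes per frame)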

Regardless of the reported framerate and the variable refresh rate of the display, the scanout speed will always match the display’s current maximum refresh rate: 16.6ms @60 Hz, 10ms @100 Hz, 6.9ms @144 Hz, and so on. G-SYNC’s ability to detach framerate and refresh rate from the scanout speed can have benefits such as faster frame delivery and reduced input lag on high refresh rate displays at lower fixed framerates (see G-SYNC 101: Hidden Benefits of High Refresh Rate G-SYNC).
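
Put another way, per-frame scanout time depends only on the panel’s maximum refresh rate (1000 ÷ max Hz), never on the current framerate; a trivial illustration:

# The scanout always runs at the panel's maximum refresh rate, regardless of
# the current framerate/variable refresh rate.
def scanout_ms(max_refresh_hz: float) -> float:
    return 1000.0 / max_refresh_hz

for hz in (60, 100, 144):
    print(f"{hz} Hz panel: each frame is scanned out in ~{scanout_ms(hz):.1f} ms")
# 60 Hz: ~16.7 ms, 100 Hz: ~10.0 ms, 144 Hz: ~6.9 ms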



3696 Comments For “G-SYNC 101”


tearxinnuan

Thank you very much for your article and tutorial! I’ve set up the appropriate settings according to your article, but I still have some questions I’d like to ask!

First, my current settings are:
NVCP: G-SYNC + V-SYNC on, LLM off,
In Game: Reflex on + boost, V-SYNC off

I believe this setup is optimal for G-SYNC usage. I don’t limit my frame rate using any external software or NVCP. When I enable Reflex in-game, it automatically caps my frame rate at 260 FPS (my monitor is 280 Hz). I think relying solely on Reflex to limit my frame rate would be more straightforward than setting it separately, and perhaps also avoid conflicts and instability caused by multiple frame limits. Secondly, I’ve personally tested the games I play, and Reflex takes precedence over both the in-game and NVCP frame limits. That is, no matter how much I limit my frame rate, once Reflex is enabled, it caps it at 260 FPS.

I primarily play competitive games like Valve, APEX, and Overwatch, but I also occasionally play other single-player games. All the competitive games I play have Reflex, so can I completely abandon all external frame limiting methods and rely solely on Reflex?

Also, regarding LLM in NVCP, should I set it on or off, or even set it to Ultra? I’m not sure if there are any advantages or disadvantages to turning LLM on, even though Reflex takes over a lot of the processing. There’s a lot of controversy online about LLM, and even NVIDIA officials claim that setting LLM to Ultra will minimize V-SYNC latency.

Looking forward to your answers!

dimacbka

Hi. I really liked this article, but I have a couple of questions. I have a new PC that gets 800 FPS in CS2. How do I set up the G-SYNC + V-SYNC + Reflex bundle correctly? My monitor is 280 Hz. I’m confused: do I need to limit frames via the NVIDIA panel? Yesterday I turned on “delay” on Ultra and Reflex + Boost. In the game, the frames were around 260, with the fps_max parameter set to 0.

mike-lesnik

Hello, jorimt! My question is more about input delay than G-SYNC, but I decided to ask it here because I really like your style of response: simple and clear.
I don’t quite understand what role frametime plays in input delay. It is often written that frametime is the time needed to create a frame, but 60 frames of 16.6ms each can be created by either an underloaded or an overloaded GPU. On screen, we see the same framerate and frametime in both cases, but the resulting input delay will be different…
That is, is the frametime not “the time it took the system (CPU-OS-Engine-GPU) to create the frame,” but rather “the time allotted for displaying the frame by the display before the next one appears”?

dpawelcz

I’m having an awful time trying to get Street Fighter 6 feeling good on my Zephyrus G14 gaming laptop. It has a 120 Hz OLED screen. I swear in game it doesn’t feel like it’s getting 120 Hz, and it feels input laggy.
The game is locked at 60 FPS, and it feels as if it’s running at 60 Hz. Outside the game I’ve confirmed I’m running at 120 Hz on the display. I have G-SYNC ON and V-SYNC ON in the NVIDIA Control Panel. I’ve also noticed that no matter what, SF6 starts with V-SYNC on in its settings and I have to turn it off manually every time. I suspect that might be the issue.
Any tips would be greatly appreciated.

anthony3192

When I activate V-SYNC from the NVIDIA app per profile, that is, game by game (as you advised me), some games like The Witcher, Wukong, and Fortnite are limited to 225 FPS (I have a 240 Hz monitor), while other titles like Valorant and CoD have an unlocked frame rate, so they are not limited. How come?
