G-SYNC 101: Range


Blur Buster's G-SYNC 101: Range Chart

Exceeds G-SYNC Range

G-SYNC + V-SYNC “Off”:
G-SYNC disengages, tearing begins display-wide, and no frame delay is added.

G-SYNC + V-SYNC “On”:
G-SYNC reverts to V-SYNC behavior when it can no longer adjust the refresh rate to match the framerate. 2-6 frames of delay (typically 2 frames; approximately an additional 33.2ms @60 Hz, 20ms @100 Hz, 13.8ms @144 Hz, etc) are added as rendered frames begin to over-queue in both buffers, ultimately delaying their appearance on-screen.
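The arithmetic behind those delay figures is simply queue depth times refresh period. A minimal sketch, assuming the typical 2-frame queue (actual depth varies between 2 and 6 frames by game and driver), with function name chosen for illustration:

```python
# Approximate the extra delay added when V-SYNC back-pressure kicks in
# above the G-SYNC range: each queued frame waits one full refresh cycle.
def vsync_queue_delay_ms(refresh_hz: float, queued_frames: int = 2) -> float:
    refresh_period_ms = 1000.0 / refresh_hz  # time per scanout cycle
    return queued_frames * refresh_period_ms

for hz in (60, 100, 144):
    print(f"{hz} Hz: ~{vsync_queue_delay_ms(hz):.1f} ms")
```

The results (~33.3ms, ~20.0ms, ~13.9ms) match the figures above to within rounding.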

G-SYNC + Fast Sync*:
G-SYNC disengages, Fast Sync engages, 0-1 frame of delay is added**.
*Fast Sync is best used with framerates in excess of 2x to 3x that of the display’s maximum refresh rate, as its third buffer selects from the “best” frame to display as the final render; the higher the sample rate, the better it functions. Do note, even at its most optimal, Fast Sync introduces uneven frame pacing, which can manifest as recurring microstutter.
**Refresh rate/framerate ratio dependent (see G-SYNC 101: G-SYNC vs. Fast Sync).

Within G-SYNC Range

Refer to “Upper & Lower Frametime Variances” section below…

Upper & Lower Frametime Variances

G-SYNC + V-SYNC “Off”:
The tearing inside the G-SYNC range with V-SYNC “Off” is caused by sudden frametime variances output by the system, which will vary in severity and frequency depending on both the efficiency of the given game engine, and the system’s ability (or inability) to deliver consistent frametimes.

G-SYNC + V-SYNC “Off” disables the G-SYNC module’s ability to compensate for sudden frametime variances, meaning instead of aligning the next frame scan to the next scanout (the process that physically draws each frame, pixel by pixel, left to right, top to bottom on-screen), G-SYNC + V-SYNC “Off” will opt to start the next frame scan in the current scanout. This results in simultaneous delivery of more than one frame in a single scanout (tearing).

In the Upper FPS range, tearing will be limited to the bottom of the display. In the Lower FPS range (<36) where frametime spikes can occur (see What are Frametime Spikes?), full tearing will begin.

Without frametime compensation, G-SYNC functionality with V-SYNC “Off” is effectively “Adaptive G-SYNC,” and should be avoided for a tear-free experience (see G-SYNC 101: Optimal Settings & Conclusion).

G-SYNC + V-SYNC “On”:
This is how G-SYNC was originally intended to function. Unlike G-SYNC + V-SYNC “Off,” G-SYNC + V-SYNC “On” allows the G-SYNC module to compensate for sudden frametime variances by adhering to the scanout, which ensures the affected frame scan will complete in the current scanout before the next frame scan and scanout begin. This eliminates tearing within the G-SYNC range, in spite of the frametime variances encountered.

Frametime compensation with V-SYNC “On” is performed during the vertical blanking interval (the span between the previous and next frame scan), and, as such, does not delay single frame delivery within the G-SYNC range and is recommended for a tear-free experience (see G-SYNC 101: Optimal Settings & Conclusion).

G-SYNC + Fast Sync:
Upper FPS range: Fast Sync may engage, 1/2 to 1 frame of delay is added.
Lower FPS range: see “V-SYNC ‘On'” above.

What are Frametime Spikes?

Frametime spikes are an abrupt interruption of frames output by the system. On a capable setup running an efficient game engine, they typically occur due to loading screens, shader compilation, background asset streaming, auto saves, network activity, and/or the triggering of a script or physics system. They can also be exacerbated by an incapable setup, an inefficient game engine, poor netcode, low RAM/VRAM and page file overuse, misconfigured SLI setups (or limited game support for them), faulty drivers, specific or excess background processes, in-game overlay or input device conflicts, or a combination of them all.

Not to be confused with other performance issues, like framerate slowdown or V-SYNC-induced stutter, frametime spikes manifest as the occasional hitch or pause, and usually last mere microseconds to milliseconds at a time (seconds, in the worst of cases), plummeting the framerate to as low as the single digits and raising the frametime to upwards of 1000ms before it re-normalizes.
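Because spikes are brief outliers against an otherwise steady frametime baseline, they are easy to pick out of a captured frametime trace. A minimal sketch (the 2.5x-median cutoff is an illustrative assumption, not a standard threshold):

```python
from statistics import median

def find_frametime_spikes(frametimes_ms, factor=2.5):
    """Return indices of frames whose frametime exceeds `factor` times
    the median frametime of the trace."""
    baseline = median(frametimes_ms)
    return [i for i, ft in enumerate(frametimes_ms) if ft > factor * baseline]

# ~144 FPS trace (6.9 ms/frame) with a single 250 ms hitch at index 10
trace = [6.9] * 10 + [250.0] + [6.9] * 10
print(find_frametime_spikes(trace))  # [10]
```

Using the median rather than the mean keeps the baseline itself from being skewed by the very spikes being detected.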

G-SYNC eliminates traditional V-SYNC stutter caused below the maximum refresh rate by repeated frames from delayed frame delivery, but frametime spikes still affect G-SYNC, since it can only mirror what the system is outputting. As such, when G-SYNC has nothing new to sync to for a frame or frames at a time, it must repeat the previous frame(s) until the system resumes new frame(s) output, which results in the visible interruption observed as stutter.

The more efficient the game engine, and the more capable the system running it, the fewer frametime spikes there are (and the shorter they last), but no setup can fully avoid their occurrence.

Minimum Refresh Range

Once the framerate reaches the approximate 36 FPS and below mark, the G-SYNC module begins inserting duplicate refreshes per frame to maintain the panel’s minimum physical refresh rate, keep the display active, and smooth motion perception. If the framerate is 36 FPS, the refresh rate will double to 72 Hz; at 18 FPS, it will triple to 54 Hz; and so on. This behavior continues down to 1 frame per second.
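The duplication rule above can be modeled as the smallest duplicate count that lifts the physical refresh back above the minimum. A sketch under the ~36 Hz floor the article cites (the real minimum varies by panel):

```python
def effective_refresh_hz(fps: float, panel_min_hz: float = 36.0) -> float:
    """Model of G-SYNC's duplicate-refresh behavior at low framerates.
    The ~36 Hz floor is the approximate figure from the article."""
    if fps > panel_min_hz:
        return fps  # within the normal variable refresh range, 1:1
    # smallest duplicate count that lifts the refresh above the floor
    duplicates = int(panel_min_hz // fps) + 1
    return fps * duplicates

print(effective_refresh_hz(36))  # 72.0 (doubled)
print(effective_refresh_hz(18))  # 54.0 (tripled)
```

This reproduces the 36 FPS → 72 Hz and 18 FPS → 54 Hz examples from the text.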

Regardless of the reported framerate and variable refresh rate of the display, the scanout speed will always match the display’s current maximum refresh rate: 16.6ms @60Hz, 10ms @100 Hz, 6.9ms @144 Hz, and so on. G-SYNC’s ability to detach framerate and refresh rate from the scanout speed can have benefits such as faster frame delivery and reduced input lag on high refresh rate displays at lower fixed framerates (see G-SYNC 101: Hidden Benefits of High Refresh Rate G-SYNC).
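The scanout figures follow directly from the panel's maximum refresh rate, independent of the current framerate, which is the point of the paragraph above. A quick sketch (values here round to one decimal, so 60 Hz shows as ~16.7ms rather than the truncated 16.6ms in the text):

```python
def scanout_ms(max_refresh_hz: float) -> float:
    # The scanout always sweeps the panel at its maximum rate,
    # regardless of the current framerate or variable refresh rate.
    return 1000.0 / max_refresh_hz

for hz in (60, 100, 144):
    print(f"{hz} Hz panel: ~{scanout_ms(hz):.1f} ms per scanout")
```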



3006 Comments For “G-SYNC 101”


dandyjr

Hey Jorimt, I have a question regarding the issue of avoiding the G-Sync ceiling.

I recently purchased a 280Hz Freesync monitor (it’s not official G-Sync Compatible but after testing in multiple games, it seems to mirror my 144Hz officially G-Sync Compatible monitor in accuracy) and I’ve noticed an issue that has happened on previous monitors I’ve owned as well.

This monitor doesn’t have an OSD that I can have toggled on at all times so I had to manually open the menu to check each time but I’ve noticed that there are multiple moments where the refresh rate will read as 280Hz instead of whatever framerate the game is reading as at the time.

For example, I tried with in-game and external limiters, I can cap the frames at 240fps and the game will read as 240fps but when I open the menu of the monitor, it reads as 280Hz for that moment (causing me to believe that G-Sync has disabled in that moment and Vsync has toggled on). Then I’ll close the menu and reopen it and then the refresh rate will read as some variation under the 280Hz ceiling. That lets me know that G-Sync does engage, but there are moments where it’s not engaging even when it should.

I tested this in multiple games with multiple forms of framerate caps and noticed the same trend. The closer I capped to 280Hz, the more times I would see 280Hz in the monitor. The only way to stop it from happening was to cap the framerate far below the ceiling. Capping at 277fps, for example, in RTSS caused the 280Hz readout to never change at all (which would indicate that G-Sync was not engaging) which caused me a lot of frustration.

I would think that it’s possibly due to the monitor not being officially G-Sync Compatible, but the same issue would happen with my 144Hz monitor as well (with lower framerates of course, because the ceiling is lower), and that monitor had an OSD that I could leave toggled on. I would see the numbers rapidly change, and by watching very closely, you could see the 144Hz flash multiple times within the mixture.

Am I overthinking this or are the monitors actually reading correctly and G-Sync is disengaging and re-engaging constantly even with framerate caps below the ceiling?

It’s sad to think that the 280Hz ceiling is useless because framerates need to be capped far below the ceiling even with external limiters that appear to be perfect in execution.

TkoSeven

Thanks for the wonderful article.

2 questions!

Adjust desktop size and position section,
“Perform scaling on: Display or GPU” (also override the scaling mode set by games and programs)
does it matter in terms of how a G-SYNC monitor
interacts with the GPU?

2nd question on “Max frame rate” on Nvidia settings,
if a game was designed to be locked at 60fps, like Tekken 8,
Nvidia panel set to 58 fps, does it work by limiting frame data transferred to display even though
the logic of the game (application) actually went through generating data for 60 frames?
or
does it actually limit the game to only generate 58 frames?

thank you in advance.

eeayree

Hi jorimt. I myself am from another country, so I hope the translator does its job correctly. I’m currently playing Metro 2033; this game can produce from 80 to 120 fps on my system, and at the same time the GPU is not working at full capacity. It stays within 85 percent. Your tuning guide states that it is best to enable Low Latency Mode in cases where the framerate does not always reach or exceed the refresh rate. Am I doing the right thing if I leave Low Latency Mode on when the video card isn’t loaded at 99%? And does the delay appear exclusively at 99% usage? At 95, 96, or 97 percent, will everything work with minimal delays?

SovonHalder

Hi jorimt, I can’t thank you enough for writing this article. I read pages 1 through 15 multiple times to understand as much as I could, and it’s incredibly useful for folks like me who are new to PC gaming. Just wanted to acknowledge that and say many thanks, brother.

So I set up things for my 144hz 32GR93U exactly as you described: In nvcp Gsync+Vsync+Low latency ultra and the in game Vsync, double or triple buffering off and fullscreen on. I have 2 questions basically.

1. In most games I’ve tried so far, like Alan Wake 2, God of War, and Stray, the fps is capped automatically to 138 as it should be, but in Red Dead Online (Vulkan) it maxes out at 144. I use RTSS to set a custom 138 cap, but the question remains: why doesn’t it happen automatically?

2. The new NVIDIA app reports an Average PC Latency in its overlay. For me, above 60ms with mouse and controller feels sluggish, whereas when it’s below 30ms the game feels much more responsive. I want to implement that on my RTSS stat overlay. (I find the RTSS overlay more robust and feature-rich, and I can see the frametime graph; the NVIDIA overlay is finicky and slow.) Could you share some insight as to how they are calculating that number?

Ezi

I have a question: I applied the settings you suggested and my GPU usage went from 99% to hovering around 70-88%. Is this normal? Because if I were to default everything back, the usage would go back to 99%.

These are the settings I have in NCP:
– vsync on in NCP, off ingame
– set fps limiter -3
– gsync enabled – full screen
– LLM set to ultra since ingame doesnt have a limiter

my pc spec is:
– 4080 super FE
– 7800x3D
– M32U 4k 144hz monitor
