G-SYNC 101: Control Panel

G-SYNC Module

The G-SYNC module is a small chip that replaces the display’s standard internal scaler, and contains enough onboard memory to hold and process a single frame at a time.

The module exploits the vertical blanking interval (the span between the previous and next frame scan) to manipulate the display’s internal timings: it performs G2G (gray-to-gray) overdrive calculations to prevent ghosting, and synchronizes the display’s refresh rate to the GPU’s render rate to eliminate tearing, along with the delayed frame delivery and adjoining stutter caused by traditional syncing methods.
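To make the timing difference concrete, here is a small simulation sketch (our own illustration, not driver code) contrasting a fixed-refresh display, where a completed frame must wait for the next refresh boundary, with a variable-refresh display, which holds the blanking interval and scans out as soon as the frame is ready. All function names and the example frame times are ours.

```python
def fixed_refresh_scanout(frame_times_ms, refresh_hz):
    """Fixed refresh: scanout can only start on a refresh boundary,
    so a frame finished mid-refresh waits for the next boundary."""
    interval = 1000.0 / refresh_hz
    scanouts = []
    t_ready = 0.0
    for ft in frame_times_ms:
        t_ready += ft
        # round the completion time up to the next refresh boundary
        n = int(t_ready // interval)
        if t_ready % interval > 1e-9:
            n += 1
        scanouts.append(n * interval)
    return scanouts

def variable_refresh_scanout(frame_times_ms):
    """VRR: the display waits in the blanking interval and scans out
    the moment each frame finishes rendering -- no boundary wait."""
    scanouts, t = [], 0.0
    for ft in frame_times_ms:
        t += ft
        scanouts.append(t)
    return scanouts

frames = [20.0, 20.0, 20.0]                  # a steady 50 FPS render rate
fixed = fixed_refresh_scanout(frames, 60)    # 16.67 ms boundaries at 60 Hz
vrr = variable_refresh_scanout(frames)       # scanout tracks render completion
```

On a 60 Hz fixed-refresh display, each 20 ms frame misses a boundary and waits; under VRR the scanout times simply equal the render completion times.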


The Blur Busters Test UFO motion test pattern below uses motion interpolation techniques to simulate the seamless framerate transitions G-SYNC provides within the refresh rate, as directly compared to standalone V-SYNC.

G-SYNC Activation

“Enable for full screen mode” (exclusive fullscreen functionality only) automatically engages when a supported display is connected to the GPU. If G-SYNC behavior is suspect or non-functional, untick the “Enable G-SYNC, G-SYNC Compatible” box, apply, re-tick, and apply again.


G-SYNC Windowed Mode

“Enable for windowed and full screen mode” allows G-SYNC support for windowed and borderless windowed mode. This option was introduced in a 2015 driver update, and by manipulating the DWM (Desktop Window Manager) framebuffer, enables G-SYNC’s VRR (variable refresh rate) to synchronize to the focused window’s render rate; unfocused windows remain at the desktop’s fixed refresh rate until focused on.

G-SYNC only functions on one window at a time, and thus any unfocused window that contains moving content will appear to stutter or slow down, which is one reason why a variety of non-gaming applications (popular web browsers among them) include predefined Nvidia profiles that disable G-SYNC support.

Note: this setting may require a game or system restart after application; the “G-SYNC Indicator” (Nvidia Control Panel > Display > G-SYNC Indicator) can be enabled to verify it is working as intended.

G-SYNC Preferred Refresh Rate

“Highest available” automatically engages when G-SYNC is enabled, and overrides the in-game refresh rate selector (if present), defaulting to the highest supported refresh rate of the display. This is useful for games that don’t include a selector, and ensures the display’s native refresh rate is utilized.

“Application-controlled” adheres to the desktop’s current refresh rate, or defers control to games that contain a refresh rate selector.

Note: this setting only applies to games being run in exclusive fullscreen mode. For games being run in borderless or windowed mode, the desktop dictates the refresh rate.


G-SYNC (GPU Synchronization) works on the same principle as double buffer V-SYNC; buffer A begins to render frame A, and upon completion, scans it to the display. Meanwhile, as buffer A finishes scanning its first frame, buffer B begins to render frame B, and upon completion, scans it to the display, repeat.
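The double-buffer ping-pong described above can be sketched as a simple schedule (our own illustration, not driver code): while one buffer scans out to the display, the other renders the next frame, and the two swap roles each cycle. The function name is ours.

```python
def double_buffer_schedule(n_frames):
    """List the per-frame roles of the two buffers as they alternate."""
    scan, render = "A", "B"   # buffer A holds the first completed frame
    events = []
    for i in range(n_frames):
        events.append(
            f"frame {i}: scan out buffer {scan}, render into buffer {render}"
        )
        scan, render = render, scan   # swap roles for the next frame
    return events

for line in double_buffer_schedule(3):
    print(line)
```

Under G-SYNC, the only change to this loop is that each scanout begins the moment the render completes, rather than on a fixed refresh boundary.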

The primary difference between G-SYNC and V-SYNC is the method in which rendered frames are synchronized. With V-SYNC, the GPU’s render rate is synchronized to the fixed refresh rate of the display. With G-SYNC, the display’s VRR (variable refresh rate) is synchronized to the GPU’s render rate.

Upon its release, G-SYNC’s ability to fall back on fixed refresh rate V-SYNC behavior when exceeding the maximum refresh rate of the display was built-in and non-optional. A 2015 driver update later exposed the option.

This update led to recurring confusion, creating a misconception that G-SYNC and V-SYNC are entirely separate options. However, with G-SYNC enabled, the “Vertical sync” option in the control panel no longer acts as standalone V-SYNC. Instead, it dictates two behaviors: first, whether the G-SYNC module compensates for frametime variances output by the system, which prevents tearing at all times (G-SYNC + V-SYNC “Off” disables this behavior; see G-SYNC 101: Range); and second, whether G-SYNC falls back on fixed refresh rate V-SYNC behavior. If V-SYNC is “On,” G-SYNC reverts to V-SYNC behavior above its range; if V-SYNC is “Off,” G-SYNC disables above its range, and tearing begins display-wide.
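The fallback behavior described above can be summarized as a small decision sketch (the names and simplified logic are our illustration, not Nvidia’s code; it omits the in-range frametime-variance case that G-SYNC + V-SYNC “Off” affects):

```python
def sync_behavior(render_fps, max_refresh_hz, vsync_on):
    """Which syncing method governs scanout at a given render rate."""
    if render_fps <= max_refresh_hz:
        # within the VRR range, G-SYNC alone is active either way
        return "G-SYNC: refresh rate follows render rate"
    if vsync_on:
        # above the range with V-SYNC on, G-SYNC reverts to V-SYNC behavior
        return "V-SYNC fallback: capped at max refresh, no tearing"
    # above the range with V-SYNC off, no sync method is active
    return "no sync: tearing display-wide"

print(sync_behavior(120, 144, vsync_on=False))
print(sync_behavior(200, 144, vsync_on=True))
print(sync_behavior(200, 144, vsync_on=False))
```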

Within its range, G-SYNC is the only syncing method active, no matter the V-SYNC “On” or “Off” setting.

Currently, when G-SYNC is enabled, the control panel’s “Vertical sync” entry is automatically engaged to “Use the 3D application setting,” which defers V-SYNC fallback behavior and frametime compensation control to the in-game V-SYNC option. This can be manually overridden by changing the “Vertical sync” entry in the control panel to “Off,” “On,” or “Fast.”

1147 Comments For “G-SYNC 101”


Hattori Hanzo

Hi, first, a big thank you for all the explanations about G-SYNC & V-SYNC. I learned a lot and will test your settings tonight. (V-SYNC on in NVCP; it’s off now.)
Lately I’ve been searching, hoping to find someone with the same issue I have: I notice something wrong while playing Apex Legends… a sort of input lag while aiming and firing…

My rig is:
Predator XB252Q 240hz
RTX 2080 ventus oc
16gb vengeance ram
i7 9700KF
Corsair Nightsword mouse, 1000 Hz polling / 18,000 DPI

So I saw some people saying that capping 3 FPS below the max refresh rate (240 → 237) is the best possible way G-SYNC can run, and as I read this thread that seems to be true; but others say lowering the FPS to a constant frametime is the best way to keep stutter, input lag, tearing, etc. away…


Others say capping to 180 is better, because Apex can’t do more even if you uncap it…

I have V-SYNC off in both NVCP and in-game, +fps_max unlimited in Apex, and RTSS capped to 237 FPS… but I see variable frametimes. If I cap to 180 FPS, it shows a constant 5.5 ms 85-90% of the time (except on drops, where it falls to 120-130 FPS).

Is there a better configuration? Maybe enabling V-SYNC in NVCP, or Max Frame Rate in NVCP?…

That said, my initial question is about frametime: variable or fixed? Which is better? Or does it not matter if G-SYNC & V-SYNC are enabled?…

I only wish to get the best possible settings for my rig to get the maximum out of my games…

Thanks, and sorry for my English; I’m not a native speaker…
(I hope this wasn’t answered before… otherwise I missed it… sorry)


Firstly, thanks so much for sharing your hard work. You may have gone over this, but I wanted to be sure. I am using a FreeSync monitor with G-SYNC + V-SYNC (in control panel) + a 138 FPS cap in-game. Is this the lowest possible input lag I can obtain, or would turning G-SYNC off give me better results? Thanks for your time.


Hello there! Thanks for sharing your hard work. I’d like to ask you a simple question: why does in-game V-SYNC feel smoother than NVCP V-SYNC in my case? Mainly in CPU-dependent games like Diablo 3, LoL, or Heroes of the Storm? Always with G-SYNC enabled in fullscreen.
My specs are:

Acer XB241H @180hz 1080p
i7 8700K @ 4.7 GHz
RTX 2080 stock drivers 451.67
Windows 10 2004 HAGS on.

Thanks for your reply.


Hello there, just a simple question:
Can I enable V-SYNC inside the game instead of forcing it globally in the Nvidia control panel?
If not, why?

With BF4, for example, when V-SYNC is on in NVCP and off in-game, V-SYNC is not enabled (entering the command render.drawscreeninfo shows vsync = off), but it works fine in BF1, though…



Firstly, thank you for the detailed article and the hard work put into testing. It’s certainly helped me have a tear-free, low-latency experience. Second, I have a question, something I’ve been wondering about for a while.

Does the complexity of the rendered scene impact input latency?

To give you more detail, I’m using a 144 Hz G-Sync monitor with my framerate locked to 60 FPS (via RTSS) in this particular game, the reason being I rarely get a framerate higher than that, but never drop below it; I try to do this in every game I play. I am able to maintain a consistent framerate and frametime, but I’ve noticed my input latency feels like it increases when I’m looking at a more complex scene, like a vista, rather than, say, looking at my feet. This doesn’t make sense to me, though, because my framerate and frametimes are all unchanging.

Can you offer any insight? Am I just imagining it, or does the complexity actually impact felt latency?