G-SYNC 101: Closing FAQ


“Let me get this straight…”

“Closing FAQ” published Jan 25, 2019

G-SYNC 101 has been around long enough now to have accumulated some frequently asked follow-up questions from readers, both in the comments sections here and in the Blur Busters Forums.

To avoid further confusion or repeat questions, I have compiled a selection of them below for easier access. This section may change and grow as time goes by, so check back here regularly for updates before asking your question.



3702 Comments For “G-SYNC 101”


barry12345
Member

Is there any explanation for why my CS2 frame rate is capping at 200 FPS on a 240Hz monitor?

I'm using G-SYNC + V-SYNC (NVCP) + Reflex.

With V-SYNC off it uncaps and I get around 300 FPS, so shouldn't it cap at 225 like it does for everyone else?

With V-SYNC on, the FPS stays capped at 200, even with a 225 limit set in the NVCP.

barry12345
Member

Overwatch with these settings caps at 239 instead of 225.

ItapL
Member

In the Windows graphics settings, there’s an option for “Variable Refresh Rate.” Should I turn this on? (I’m not using an English version of Windows, so I’m not sure what the exact label says.)

My monitor has a real-time refresh rate display, but when I enable both G-SYNC and V-SYNC, it always shows the maximum refresh rate of 320 Hz in games — it doesn’t change dynamically. My G-SYNC settings are configured correctly, and V-SYNC is forced on through NVIDIA Profile Inspector.

tearxinnuan
Member

Thank you very much for your article and tutorial! I’ve set up the appropriate settings according to your article, but I still have some questions I’d like to ask!

First, my current settings are:
NVCP: G-SYNC + V-SYNC on, LLM off
In-game: Reflex On + Boost, V-SYNC off

I believe this setup is optimal for G-SYNC usage. I don't limit my frame rate with any external software or the NVCP. When I enable Reflex in-game, it automatically caps my frame rate at 260 FPS (my monitor is 280Hz). I think relying solely on Reflex to limit my frame rate is more straightforward than setting a limit separately, and it perhaps also avoids conflicts and instability caused by multiple frame limiters. Secondly, I've personally tested the games I play, and Reflex takes precedence over both the in-game and NVCP frame limits. That is, no matter how I limit my frame rate, once Reflex is enabled, it caps it at 260 FPS.

I primarily play competitive games like Valve titles, APEX, and Overwatch, but I also occasionally play single-player games. Since the competitive games I play all have Reflex, can I completely abandon all external frame limiting methods and rely solely on Reflex?

Also, regarding LLM in the NVCP: should I set it to On, Off, or even Ultra? I'm not sure whether there are any advantages or disadvantages to turning LLM on, given that Reflex already takes over much of that processing. There's a lot of controversy online about LLM, and even NVIDIA officially claims that setting LLM to Ultra will minimize V-SYNC latency.

Looking forward to your answers!

dimacbka
Member

Hi. I really liked this article, but I have a couple of questions. I have a new PC that gets 800 FPS in CS2. How do I set up the G-SYNC + V-SYNC + Reflex combination correctly? My monitor is 280Hz. I'm confused: do I need to limit frames via the NVIDIA panel? Yesterday I set Low Latency Mode ("delay" in my localized NVCP) to Ultra and enabled Reflex + Boost. In-game, the frame rate hovered around 260, with fps_max set to 0.
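Several questions above boil down to the same observation: Reflex caps a 280Hz monitor near 260 FPS and a 240Hz monitor near 225 FPS. A rule of thumb often circulated in the community (an empirical observation, not an official NVIDIA formula, so treat the exact numbers as an assumption) is that the Reflex auto-cap lands near refresh − refresh²/3600:

```python
# Approximation of the Reflex auto-cap observed by users.
# NOTE: this formula is a community rule of thumb, not an
# official NVIDIA specification; actual caps may differ slightly.

def reflex_auto_cap(refresh_hz: float) -> float:
    """Estimated FPS cap Reflex applies at a given refresh rate."""
    return refresh_hz - (refresh_hz * refresh_hz) / 3600

for hz in (240, 280, 320):
    print(f"{hz} Hz -> ~{reflex_auto_cap(hz):.0f} FPS")
```

At 240Hz this yields ~224 FPS, in line with the ~225 cap readers expect, and at 280Hz it yields ~258 FPS, close to the 260 FPS cap reported in the comments above. A game holding to a lower number (such as the 200 FPS CS2 report) is likely applying an additional limiter of its own.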

mike-lesnik
Member

Hello, jorimt! My question is more about input lag than G-SYNC, but I decided to ask it here because I really like your style of response: simple and clear.
I don't quite understand what role frametime plays in input lag. It is often written that frametime is the time needed to create a frame, but 60 frames of 16.6 ms each can be produced by either an underloaded or an overloaded GPU. On screen we see the same framerate and frametime in both cases, yet the resulting input lag is different…
So is frametime not "the time it took the system (CPU → OS → engine → GPU) to create the frame," but rather "the time the display shows the frame before the next one appears"?
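The distinction in the question above can be sketched with a toy latency model: at identical frametimes, an underloaded (capped) pipeline and a GPU-bound pipeline with a full pre-render queue produce very different input-to-display lag. The queue depths and the simple (queued frames + 1) × frametime approximation here are illustrative assumptions for the sketch, not measurements:

```python
# Toy model: same 60 FPS frametime, different input lag.
# Lag is approximated as (frames queued ahead + 1) * frametime;
# queue depths are illustrative assumptions, not measurements.

FRAMETIME_MS = 1000 / 60  # ~16.7 ms per frame at 60 FPS

def input_lag_ms(frametime_ms: float, queued_frames: int) -> float:
    """Approximate input-to-display lag for a simple render pipeline."""
    return (queued_frames + 1) * frametime_ms

# Underloaded GPU: CPU-limited or frame-capped, render queue stays empty.
capped = input_lag_ms(FRAMETIME_MS, queued_frames=0)

# Overloaded GPU: maxed out, pre-rendered frames pile up ahead of display.
gpu_bound = input_lag_ms(FRAMETIME_MS, queued_frames=3)

print(f"underloaded: {capped:.1f} ms, GPU-bound: {gpu_bound:.1f} ms")
```

Both cases show 60 FPS and 16.7 ms frametimes on an overlay, but the GPU-bound case accumulates several frametimes of lag in the queue, which is why frame limiting below the GPU's maximum output reduces input lag without changing the displayed frametime.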
