G-SYNC 101: In-game vs. External FPS Limiters


Closer to the Source*

*As of Nvidia driver version 441.87, Nvidia has made an official framerate limiting method available in the NVCP; labeled “Max Frame Rate,” it is a CPU-level FPS limiter, and as such, is comparable to the RTSS framerate limiter in both frametime performance and added delay. The Nvidia framerate limiting solutions tested below are legacy, and their results do not apply to the “Max Frame Rate” limiter.

Up until this point, an in-game framerate limiter has been used exclusively to test FPS-limited scenarios. However, in-game framerate limiters aren't available in every game, and while they aren't required for games where the framerate can't meet or exceed the maximum refresh rate, if the system can sustain the framerate above the refresh rate and no such option is present, an external framerate limiter must be used instead to prevent V-SYNC-level input lag.

In-game framerate limiters, being at the game’s engine-level, are almost always free of additional latency, as they can regulate frames at the source. External framerate limiters, on the other hand, must intercept frames further down the rendering chain, which can result in delayed frame delivery and additional input latency; how much depends on the limiter and its implementation.
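To make the distinction concrete, below is a minimal sketch of a generic game loop with an engine-level cap. It is only an illustration under assumed conditions: the function names are placeholders rather than code from any actual game, RTSS, or the Nvidia driver, and the comments mark where an external limiter would have to insert its wait instead.

```cpp
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

// Hypothetical stand-ins for the work a game does each frame.
void sample_input() { /* read mouse/keyboard state */ }
void simulate()     { /* advance the game state */ }
void render()       { /* build and submit draw calls */ }
void present()      { /* hand the finished frame to the driver/GPU */ }

int main() {
    const double target_fps = 141.0; // example cap a few FPS below a 144Hz ceiling
    const auto frame_budget = std::chrono::duration<double>(1.0 / target_fps);

    auto next_frame = Clock::now();

    for (int frame = 0; frame < 1000; ++frame) {
        // Engine-level limiter: the wait happens here, *before* input is
        // sampled, so the rendered frame always uses the freshest input.
        std::this_thread::sleep_until(next_frame);
        next_frame += std::chrono::duration_cast<Clock::duration>(frame_budget);

        sample_input();
        simulate();
        render();

        // An injected, external limiter can only delay around this call,
        // i.e. *after* input has already been sampled for the frame, which
        // is how it can add up to one frame of input latency.
        present();
    }
    return 0;
}
```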

RTSS is a CPU-level FPS limiter, which is the closest an external method can get to the engine level of an in-game limiter. In my initial input lag tests for my original thread, RTSS appeared to introduce no additional delay when used with G-SYNC. However, it was later discovered that disabling CS:GO's "Multicore Rendering" setting, which runs the game on a single CPU core, had caused the discrepancy; once it was re-enabled, RTSS introduced the expected 1 frame of delay.

Seeing as CS:GO still uses DX9 and is natively a single-core performer, I opted to test the more modern "Overwatch" this time around, which uses DX11 and features native multi-threaded/multi-core support. Will RTSS behave the same way in a native multi-core game?

[Charts: Blur Buster's G-SYNC 101: Input Latency & Optimal Settings]

Yes, RTSS still introduces up to 1 frame of delay, regardless of the syncing method (or lack thereof) used. To prove that a -2 FPS limit was enough to avoid the G-SYNC ceiling, a -10 FPS limit was also tested, with no improvement. The V-SYNC scenario also shows that RTSS delay stacks with other types of delay, retaining FPS-limited V-SYNC's 1/2 to 1 frame of cumulative delay.
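For a sense of how little margin a -2 FPS cap actually leaves, the quick calculation below compares the average frametime at -2 and -10 FPS caps against the refresh cycle; the 144Hz figure is only an example ceiling, not necessarily the display used in these tests.

```cpp
#include <cstdio>
#include <initializer_list>

int main() {
    const double refresh_hz = 144.0;               // example G-SYNC ceiling only
    const double scanout_ms = 1000.0 / refresh_hz; // ~6.944 ms per refresh cycle

    for (double offset : {2.0, 10.0}) {
        const double capped_fps   = refresh_hz - offset;
        const double frametime_ms = 1000.0 / capped_fps;
        std::printf("-%.0f FPS cap: %.3f ms/frame, %.3f ms of headroom\n",
                    offset, frametime_ms, frametime_ms - scanout_ms);
    }
    return 0;
}
```

At 144Hz this works out to roughly 0.098 ms of per-frame headroom for a -2 FPS cap and roughly 0.518 ms for -10 FPS; a limiter that holds frametimes as tightly as RTSS does can stay under the G-SYNC ceiling even with the smaller margin.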

Next up is Nvidia's FPS limiter, which can be accessed via the third-party "Nvidia Inspector." Unlike RTSS, it is a driver-level limiter, one step further removed from the engine level. My original tests showed the Nvidia limiter introduced 2 frames of delay across V-SYNC OFF, V-SYNC, and G-SYNC scenarios.

[Charts: Blur Buster's G-SYNC 101: Input Latency & Optimal Settings]

Yet again, the results for V-SYNC and V-SYNC OFF ("Use the 3D application setting" + in-game V-SYNC disabled) show that standard, out-of-the-box usage of both Nvidia's v1 and v2 FPS limiters introduces the expected 2 frames of delay. The limiter's impact on G-SYNC appears to be particularly unforgiving, with a 2 to 3 1/2 frame delay and an increase in maximums at -2 FPS compared to -10 FPS, meaning a -2 FPS limit with this limiter may not be enough to keep the framerate below the G-SYNC ceiling at all times, and the issue may be worsened by the effect of the Nvidia limiter's own frame pacing behavior on G-SYNC functionality.

Needless to say, even when an in-game framerate limiter isn't available, RTSS introduces only up to 1 frame of delay, which is still preferable to the 2+ frame delay added by Nvidia's limiter with G-SYNC enabled, and is a far superior alternative to the 2-6 frame delay added by uncapped G-SYNC.



1620 Comments For “G-SYNC 101”


BlurDawg

Hello again Jorimt, hope you’re having a great day.

I have a question less related to G-SYNC and more to do with render latency. I've noticed that as I turn the resolution down, the render latency decreases, even at the same FPS. In fact, I've found that capped at 60 FPS, the render latency is around 11.5ms at 1080p, compared to 16-18ms when capped at 60 FPS at 1440p. At 720p the render latency dropped even further.

As far as I understand, render latency is linked to input latency. Does this mean that, for example, with console games capped at 60 FPS at 4K, the input latency is much higher than it could be if the game were instead rendered at 720p, since the GPU is doing less work per frame?

AidenJr

Hi, this was exactly what I was looking for, but I still have one unanswered question in my mind.
I'm going to buy a 165Hz G-SYNC monitor, and the game that I play runs at around 200 FPS. Will I necessarily get screen tearing if I don't cap at 162? And do you suggest capping the FPS at 162, or playing on higher graphics settings to stay below the monitor's refresh rate, like 150 FPS or so? Since I play FPS games, input lag really matters to me.
Regards

Zole

After reinstalling Windows recently, my G-SYNC behaviour has changed.

As far as I can remember, my usual setup was:
League of legends played in Windowed Borderless Mode.
v-sync: disabled – in-game
v-sync: enabled (set to “Fast”) – in NVCP
g-sync: enabled for both windowed and full-screen – in NVCP
preferred refresh rate: Highest available – in NVCP
power management: Prefer maximum performance – in NVCP
Monitor technology: G-SYNC – in NVCP
frame rate: uncapped – in game

With these settings, my frame rate was capped by Fast V-SYNC to 1 frame below my monitor's max refresh rate: 164 (down from 165).

However, after reinstalling Windows and reapplying these same settings, the frame rate is no longer capped to 1 below the monitor's refresh rate. Instead, I get FPS anywhere from 200-600, and I notice stutters and tearing.

Is there any way for me to get back my previous system behaviour?

P.S. I know the recommended way to set up a system is with full-screen G-SYNC, but I prefer windowed borderless for rapid alt-tabbing, as I do that frequently.

brdon209

I am looking to buy a 1440p 144Hz monitor, and from what I have read, I should cap my FPS at 141 using NVCP, and set V-SYNC on in NVCP and off in-game. However, many sources have told me to turn NULL on, and others have told me to turn it off completely. Should I set it to Ultra, On, or Off? I play League of Legends, FYI, and my CPU is usually around 16% usage and my GPU around 30%, uncapped at 1080p. Sorry, I am new to this subject.

georgi74

I have a system with a GeForce 3080, and as a display I use an LG C9 OLED TV with 120Hz and G-SYNC.

In Nioh 2, which can easily run at over 120 FPS with DLSS at 4K on that system, if I limit the game to 117 FPS with RTSS, I get micro-stutter when just looking around. If I disable the RTSS limiter and let the game limit the FPS to 120 (in-game there is only an option to lock at 30/60/120), the game is buttery smooth. Is there any way to get the game running smoothly with a limit of 117 FPS?
