G-SYNC 101: G-SYNC vs. V-SYNC OFF w/FPS Limit

At the Mercy of the Scanout

Now that the FPS limit required for G-SYNC to avoid V-SYNC-level input lag has been established, how do G-SYNC + V-SYNC and G-SYNC + V-SYNC “Off” compare to V-SYNC OFF at the same framerate?

[Charts: Blur Buster's G-SYNC 101: Input Latency & Optimal Settings]

The results show a consistent difference between the three methods across most refresh rates (240Hz is nearly equalized in any scenario), with V-SYNC OFF (G-SYNC + V-SYNC “Off,” to a lesser degree) appearing to have a slight edge over G-SYNC + V-SYNC. Why? The answer is tearing…

With any vertical synchronization method, the delivery speed of a single, tear-free frame (barring unrelated frame delay caused by many other factors) is ultimately limited by the scanout. As mentioned in G-SYNC 101: Range, the “scanout” is the total time it takes a single frame to be physically drawn, pixel by pixel, left to right, top to bottom on-screen.

With a fixed refresh rate display, both the refresh rate and scanout remain fixed at their maximum, regardless of framerate. With G-SYNC, the refresh rate is matched to the framerate; the scanout speed remains fixed, but the refresh rate controls how many times the scanout is repeated per second (60 times at 60 FPS/60Hz, 45 times at 45 FPS/45Hz, etc.), along with the duration of the vertical blanking interval (the span between the previous and next frame scan), where G-SYNC calculates and performs all overdrive and synchronization adjustments from frame to frame.
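This relationship can be sketched in a few lines (a simplified model for illustration, not measured data; the function name and figures are my own, not from the article):

```python
def gsync_timing(max_hz: float, fps: float):
    """Simplified G-SYNC timing model: the scanout always runs at the
    panel's maximum refresh rate, while G-SYNC stretches the vertical
    blanking interval (VBI) so the refresh interval matches the frame
    interval. Illustrative only; real panels add small overheads."""
    scanout_ms = 1000.0 / max_hz    # fixed scanout duration
    frame_ms = 1000.0 / fps         # interval between rendered frames
    vbi_ms = frame_ms - scanout_ms  # blanking time G-SYNC inserts
    return scanout_ms, frame_ms, vbi_ms

# On a 144Hz panel at 60 FPS, the scanout still takes ~6.9 ms;
# the remaining ~9.7 ms of each ~16.7 ms frame interval is blanking.
scanout, frame, vbi = gsync_timing(144, 60)
print(f"scanout {scanout:.1f} ms, frame {frame:.1f} ms, VBI {vbi:.1f} ms")
```

In other words, lowering the framerate on a G-SYNC display does not slow the scan itself; it only lengthens the pause between scans.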

The scanout speed itself, both on a fixed refresh rate and variable refresh rate display, is dictated by the current maximum refresh rate of the display:

[Diagram: Blur Buster's G-SYNC 101: Scanout Speed]

As the diagram shows, the higher the refresh rate of the display, the faster the scanout speed becomes. This also explains why V-SYNC OFF’s input lag advantage, especially at the same framerate as G-SYNC, is reduced as the refresh rate increases; single frame delivery becomes faster, and V-SYNC OFF has less opportunity to defeat the scanout.
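In rough numbers (assuming, as a simplification, that one scanout takes one full refresh period and ignoring the brief blanking interval):

```python
# Approximate single-frame scanout duration at common maximum refresh
# rates, treating the scanout as one full refresh period. Real scanouts
# finish slightly faster, since part of the period is blanking.
for hz in (60, 120, 144, 240):
    print(f"{hz:>3}Hz -> ~{1000 / hz:.2f} ms per scanout")
```

A 240Hz panel finishes a frame roughly four times faster than a 60Hz panel, which matches the observation that V-SYNC OFF’s lag advantage shrinks as the refresh rate increases.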

V-SYNC OFF can defeat the scanout by starting the scan of the next frame(s) within the previous frame’s scanout anywhere on screen, and at any given time:

[Chart: Blur Buster's G-SYNC 101: Input Lag & Optimal Settings]

This results in simultaneous delivery of more than one frame scan in a single scanout (tearing), but also a reduction in input lag; the amount of which is dictated by the positioning and number of tearline(s), which is further dictated by the refresh rate/sustained framerate ratio (more on this later).
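A crude way to reason about the tearline count (an assumption-laden sketch, not the article’s measured data): each additional frame completed mid-scanout adds one tearline, so the average count per refresh cycle tracks the framerate-to-refresh-rate ratio.

```python
def avg_tearlines_per_scanout(fps: float, hz: float) -> float:
    """Crude model: frames completed per scanout = fps / hz; each frame
    beyond the first that lands mid-scanout adds a tearline. Below the
    refresh rate, tearing still occurs, but as a single tearline that
    drifts across successive refreshes (modeled here as 0 extra)."""
    return max(fps / hz - 1.0, 0.0)

print(avg_tearlines_per_scanout(300, 60))   # 4.0 tearlines on average
print(avg_tearlines_per_scanout(120, 144))  # 0.0 (single drifting tearline)
```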

As noted in G-SYNC 101: Range, G-SYNC + V-SYNC “Off” (a.k.a. Adaptive G-SYNC) can have a slight input lag reduction over G-SYNC + V-SYNC as well, since it will opt for tearing instead of aligning the next frame scan to the next scanout when sudden frametime variances occur.

To eliminate tearing, G-SYNC + V-SYNC is limited to completing a single frame scan per scanout, and it must follow the scanout from top to bottom, without exception. On paper, this can give the impression that G-SYNC + V-SYNC adds latency over the other two methods. In reality, the delivery of a single, complete frame with G-SYNC + V-SYNC is the fastest (neutral) speed the scanout allows; the advantage seen with V-SYNC OFF is a reduction below that neutral speed, achieved only by defeating the scanout and introducing tearing.

Bottom line: within its range, G-SYNC + V-SYNC delivers single, tear-free frames to the display as fast as the scanout allows; any faster, and tearing would be introduced.

1796 Comments For “G-SYNC 101”



Hello, is it true that a monitor with a G-SYNC module uses GPU scaling?

And is it also true that a G-SYNC module monitor cannot use display scaling?

If GPU scaling is forced on a G-SYNC module monitor, but scaling is set to “No scaling” in the NVIDIA Control Panel and the game’s resolution matches the display’s native resolution, does scaling simply not take effect anyway?

Also, is it the same story for G-SYNC Compatible monitors, or only G-SYNC module monitors?


Please take a look at this, we may have a solution for the windowed mode nightmare that doesn’t require owning an adaptive sync display!!! https://steamcommunity.com/app/993090/discussions/0/3053989078739351832/?ctp=4



First, I’d like to thank you for this awesome guide. I’m sure you’ve helped tens of thousands of people get the best out of their G-SYNC/FreeSync screens.

I do have a question no one else could answer. Maybe you know what is going on with my system and whether it is related to G-SYNC/FreeSync.

My current settings:

  • 144hz screen (LG 27GL850 Ultragear, G-sync compatible)
  • G-sync (fullscreen mode) on
  • V-sync on in NVCP
  • Framelimit 140 in RTSS
  • Core park on
  • Turned off vsync ingame and any buffers if they are available
  • Low latency mode is set to “on”.
  • Basically, exactly what you mention as the optimal settings. I’m not using any overlays like g-force exp. and RTSS.

    My problem:

    I’m experiencing some kind of (micro)stuttering when my frames drop below 140, but it’s application dependent. For example:

    When I’m playing Battlefield V on the smaller snow map (without the train), I constantly get 140 FPS. This map runs butter smooth, as I would expect with FreeSync. Other smaller maps also run butter smooth. When playing the bigger snow map (with the train) and FPS sometimes dips below 140 (like 135), I’m experiencing some kind of stuttering. It’s just not smooth like the smaller maps. This confuses me because it’s only dipping 5 frames or so below my FPS cap, but the difference for me is night and day. I have the same experience on all maps in BF V where the frames drop just a bit below my limit.

    When I run the Timespy benchmark, everything is butter smooth, but I’m nowhere near my FPS cap. In the first demo the FPS is going all over the place, dipping to 60 and going all the way up to 120, but everything looks butter smooth.

    When playing Sea of Thieves I have the same problem as BF 5. When FPS goes down, the smoothness is gone, even though FPS is above 100.

    It’s definitely not a hardware problem (unless my screen is the problem). I’ve had the same issue with completely different hardware. I already have a new CPU, GPU and RAM. Same issues.

    To give you an idea of what I’m experiencing:

    I did experience the same thing when I was still using a 60Hz screen + V-SYNC. If FPS dropped below 60, even by 1 frame, the smoothness was gone. That would be expected with 60Hz + V-SYNC, but I would not expect this with FreeSync when only dropping a few frames and staying way above 130 FPS.

    The strange thing is, this does not always happen. Like in the Timespy example, everything is butter smooth but the FPS is not even near the FPS cap. So, FreeSync is working just fine.

    Do you have any idea what I’m facing here? Could it be CPU related, since BF5 is very CPU heavy and basically always hits 100% CPU use (I’m running the 5800X)? Some cores are actually running at 100% on the small snow map (I checked), but on that map everything is butter smooth.


    Hi, I’m here for advice and I hope I’ll get some from smart people. I bought a 165Hz monitor with G-SYNC and I want to use it, but I really do care about my input delay, so I want to know which option is going to be the best in my situation:

    1. 165 hz monitor + 162 in game fps lock + g-sync on + v-sync on
    2. 165 hz monitor + 162 in game fps lock + g-sync on + v-sync off
    3. 165 hz monitor + 180 in game fps lock + g-sync off + v-sync off

    So the questions are:

    • Does V-SYNC even make sense with G-SYNC, and will I get less input delay by turning V-SYNC on or by turning it off (while using G-SYNC)?

    • Should I even use g-sync? Is there a big difference in the input delay between the option 1 and option 3 in my list?

    I hope you got my questions right and will help me))


    Beautiful, this is like the holy grail when it comes to G-Sync information, thanks.

    I just want to ask to be sure, since I recently switched to an Nvidia GPU with a G-SYNC Compatible display, and reading through I saw some conflicting comments too, especially since recent driver updates seem to have changed a bunch of things (for example, there is no more display selection under the G-SYNC tab; it just says “Set the G-SYNC capable display as the primary display”).

    So for the best results, all I have to do in Nvidia Control Panel is:

    ● G-Sync: “ON” (Fullscreen) in the G-Sync tab and Monitor Technology set to “G-Sync” too?
    ● Vertical Synchronization: “V-Sync ON” (not Fast Sync)
    ● Low Latency Mode: “ULTRA”
    ● Max Frame Rate: “141” (my monitor is 144Hz)
    ● In game: V-Sync or related settings all OFF

    If someone can quickly glance over this and say if it’s correct, thanks!