NVIDIA G-SYNC: Variable Refresh Rate Monitors!

One small step closer to tomorrow’s Holodeck: NVIDIA G-SYNC! It is a technique for refreshing computer monitors at variable intervals (up to a certain limit). Instead of refreshing the monitor at fixed intervals, the monitor is refreshed when the GPU finishes generating a frame! [Image: AnandTech liveblog of launch event]

Variable refresh rate combines the advantages of VSYNC ON (no tearing) with the advantages of VSYNC OFF (low input lag), while virtually eliminating stutters (no mismatch between frame rate and refresh rate).
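To illustrate why this helps, here is a minimal sketch (the function names and timings are my own illustration, not NVIDIA’s implementation) of when a finished frame actually reaches the screen under a fixed 60Hz refresh versus a variable refresh:

```python
import math

REFRESH_INTERVAL = 1.0 / 60  # fixed-refresh monitor: ~16.7 ms slots

def fixed_refresh_display_time(render_done):
    """VSYNC ON: a finished frame waits for the next fixed refresh slot."""
    slots = math.ceil(render_done / REFRESH_INTERVAL)
    return slots * REFRESH_INTERVAL

def variable_refresh_display_time(render_done):
    """G-SYNC-style: the monitor refreshes as soon as the frame is ready."""
    return render_done

# A frame that takes 20 ms to render (~50 fps):
done = 0.020
print(fixed_refresh_display_time(done))     # held back to the 33.3 ms slot
print(variable_refresh_display_time(done))  # shown immediately at 20 ms
```

The fixed-refresh case shows where stutter and added lag come from: any frame that misses a refresh slot waits a full extra interval, while the variable-refresh case displays it the moment rendering completes.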

It is confirmed that this alone won’t eliminate motion blur as completely as strobe backlights (e.g. LightBoost), but it is a great step toward eliminating discrete refresh rates, which create motion blur even at 144Hz (see photos of 60Hz vs 120Hz vs LightBoost), especially as the 144Hz limit is raised in future G-SYNC monitors. Hopefully strobe backlight technologies can be combined with G-SYNC in the future, and hopefully already in some upcoming models.

EDIT:
– All G-SYNC monitors include an official strobe backlight mode, better than LightBoost!
– Mark Rejhon has quickly come up with a new method of dynamically blending PWM-free backlight at lower framerates to strobing at higher framerates; see addendum to Electronics Hacking: Creating a Strobe Backlight. This allows combining LightBoost + G-SYNC without creating flicker during lower framerates!
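The blending idea can be sketched roughly as follows (the thresholds and the linear cross-fade are my assumptions for illustration; see the linked addendum for the actual method): below a certain framerate the backlight stays steady (PWM-free) to avoid visible flicker, above a higher framerate it strobes fully, and in between the strobe depth is cross-faded.

```python
STEADY_BELOW_FPS = 60.0   # assumed: fully steady backlight under 60 fps
FULL_STROBE_FPS = 100.0   # assumed: full strobing above 100 fps

def strobe_depth(fps):
    """Return 0.0 (steady, PWM-free backlight) .. 1.0 (full strobing)."""
    if fps <= STEADY_BELOW_FPS:
        return 0.0
    if fps >= FULL_STROBE_FPS:
        return 1.0
    # Linear cross-fade between the two thresholds for a seamless transition
    return (fps - STEADY_BELOW_FPS) / (FULL_STROBE_FPS - STEADY_BELOW_FPS)

print(strobe_depth(45))   # steady backlight: no flicker at low framerates
print(strobe_depth(80))   # partially blended
print(strobe_depth(120))  # full strobing: maximum motion-blur reduction
```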

 


About Mark Rejhon

Also known as Chief Blur Buster. Founder of Blur Busters. Inventor of TestUFO. Read more about him on the About Mark page.

39 Comments For “NVIDIA G-SYNC: Variable Refresh Rate Monitors!”


snakeeyes

Hey, I’ve got a question: can you use both G-SYNC and strobe modes at the same time? Because that’s the only reason that would make me dump AMD’s Mantle and TrueAudio for a cheaper price.

snakeeyes

and get a 4k monitor ofc instead of all this

blargg

kirkh, couldn’t it just lengthen the next strobe? If the variation in frame duration was too great, it could insert an extra strobe, rather than have brightness fall a lot and then rise a lot during the next few frames to compensate. This would only occur when the program was varying the rate drastically.

kirkh

Wouldn’t you need to know the duration of the current frame to adjust strobing appropriately? And if so, doesn’t that imply at least one frame of latency? I think your solution works; curious to see what NVIDIA has cooked up, maybe it’s the same thing.
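The brightness-compensation idea blargg and kirkh are discussing can be sketched like this (all numbers are illustrative assumptions, not NVIDIA’s actual implementation): if the strobe pulse width scales with the frame duration, the duty cycle, and therefore the average brightness, stays constant even as framerates vary. As kirkh notes, doing this exactly requires knowing the frame duration, which implies a frame of latency unless it is predicted.

```python
TARGET_DUTY = 0.10  # assumed: backlight is lit 10% of each frame on average

def strobe_pulse_ms(frame_duration_ms):
    """Pulse width that keeps the duty cycle (and brightness) constant."""
    return frame_duration_ms * TARGET_DUTY

# A 10 ms frame (100 fps) and a 20 ms frame (50 fps) get different pulse
# widths, but the same 10% duty cycle, so perceived brightness is steady:
print(strobe_pulse_ms(10.0))  # 1.0 ms pulse
print(strobe_pulse_ms(20.0))  # 2.0 ms pulse
```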
