
Nvidia, G-sync, greed

Adaptive vsync inverts the usual relationship between the monitor and the graphics card. Traditionally the monitor has a fixed vertical refresh rate and the graphics card synchronizes to it (if you want to avoid screen tearing): the monitor decides the refresh rate and the graphics card obeys. Adaptive vsync does it the other way around: the graphics card decides the refresh rate and the monitor synchronizes to it, which means the refresh rate can be variable. In practice this means that if the game you are playing drops its rendering speed to eg. 53 frames per second, the monitor will refresh at 53 Hz, rather than the effective rate dropping to 30 FPS because the monitor can't handle anything in between. In other words, each rendered frame is shown as soon as it's ready, rather than the system having to wait for the next monitor vsync to show it. The rendering speed can vary from frame to frame and it doesn't matter: the image will be shown immediately when it's ready (up to the maximum refresh rate of the monitor, which is usually 144 Hz or more).
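
To make the difference concrete, here is a minimal sketch in Python (not tied to any real graphics API; the 60 Hz fixed rate, the 144 Hz adaptive maximum and the function names are just illustrative assumptions) of when a frame that took 18.9 ms to render actually becomes visible on each kind of monitor:

    # Toy comparison of frame presentation times (all times in milliseconds).
    # Assumed figures: a 60 Hz fixed-rate monitor vs. an adaptive-sync monitor
    # with a 144 Hz maximum refresh rate.
    FIXED_REFRESH_HZ = 60
    ADAPTIVE_MAX_HZ = 144

    def display_time_fixed(frame_ready_ms: float) -> float:
        """With fixed-rate vsync, the frame waits for the next refresh tick."""
        interval = 1000.0 / FIXED_REFRESH_HZ    # ~16.7 ms per refresh
        ticks_passed = int(frame_ready_ms // interval)
        return (ticks_passed + 1) * interval    # next vsync after the frame is ready

    def display_time_adaptive(frame_ready_ms: float, prev_display_ms: float) -> float:
        """With adaptive vsync, the frame is shown as soon as it's ready,
        limited only by the monitor's maximum refresh rate."""
        min_interval = 1000.0 / ADAPTIVE_MAX_HZ  # ~6.9 ms minimum between refreshes
        return max(frame_ready_ms, prev_display_ms + min_interval)

    # A frame rendered at ~53 FPS is ready 18.9 ms after the previous refresh at t = 0.
    ready = 18.9
    print(f"fixed 60 Hz monitor shows it at {display_time_fixed(ready):.1f} ms")
    print(f"adaptive-sync monitor shows it at {display_time_adaptive(ready, 0.0):.1f} ms")

On the fixed 60 Hz monitor the frame has to wait for the next refresh tick at 33.3 ms, ie. an effective 30 FPS cadence, while the adaptive-sync monitor shows it at 18.9 ms, ie. at the full 53 FPS.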

(Note that, confusingly, Nvidia also uses the term "adaptive vsync" for a driver setting aimed at monitors without any adaptive-refresh support: vsync is simply turned off whenever the rendering speed drops below the monitor's refresh rate, eg. 60 FPS, and turned back on when the rendering speed reaches or surpasses it. That mode shouldn't be confused with the monitor-side adaptive vsync this post is about.)
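
For clarity, a minimal sketch (not Nvidia's actual driver logic; the 60 Hz figure and the function name are assumptions) of what that driver-level toggle amounts to:

    # Driver-level "adaptive vsync" on a fixed-rate monitor, roughly: keep vsync
    # on while the game keeps up with the refresh rate, drop it (accepting some
    # tearing) when the game falls behind.
    MONITOR_REFRESH_HZ = 60   # assumed fixed-rate monitor

    def vsync_enabled(current_fps: float) -> bool:
        return current_fps >= MONITOR_REFRESH_HZ

    for fps in (75, 60, 53, 30):
        print(f"{fps:3d} FPS -> vsync {'on' if vsync_enabled(fps) else 'off'}")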

Obviously adaptive vsync requires explicit hardware support from the monitor; it can't work with old monitors. There are currently two competing standards: G-sync from Nvidia and FreeSync from AMD. They use rather different technologies and work somewhat differently internally. Both are proprietary (even though FreeSync is based on the more open VESA Adaptive-Sync standard) and incompatible with each other: a G-sync monitor won't do adaptive vsync with an AMD card using FreeSync, and the other way around (unless the monitor supports both technologies). If the monitor doesn't support the technology your card uses, you simply get your regular old fixed refresh rate.

Apparently there is little technical reason why eg. Nvidia cards couldn't support both G-sync and FreeSync. Nvidia simply refuses to support the latter, for marketing reasons. And (AFAIK) AMD cards can't support G-sync because it's a highly proprietary technology, and Nvidia doesn't want its competition using its flagship feature.

But that's not the major problem with Nvidia's G-sync. No, the major problem is that Nvidia is incredibly greedy with it.

For example, the Acer XG270HU and the Acer XB271HU are pretty much the same monitor with pretty much the same specs; the significant difference is that the former supports FreeSync while the latter supports G-sync. Yet the former costs (as of writing this blog post) £441, or 522€, on Amazon UK, while the latter costs £627, or 742€.

Why such an enormous price difference between two monitors that are otherwise pretty much identical? Because that extra £186 / 220€ goes to Nvidia: they charge display manufacturers roughly that much for using their proprietary G-sync technology. Meanwhile AMD (AFAIK) doesn't charge anything for the use of FreeSync.

And if you have an Nvidia graphics card, you pretty much don't have a choice. As mentioned, Nvidia doesn't support FreeSync, so either you pay that significant premium for a G-sync monitor, or you don't get any adaptive vsync at all.

We are effectively being victims of a monopoly here.
