Why is NVIDIA's G-SYNC technology almost forgotten?

NVIDIA's G-SYNC technology was quite a revolution: it brought the concept of variable refresh rate to the PC to eliminate certain artifacts and image errors. Today, however, this technology is seen less and less on monitors, and everything indicates that it could quietly disappear from the market over time. Why will monitors with this technology gradually stop appearing?

When G-SYNC came on the market, the vast majority of monitors lacked variable refresh rate technology, which is shorthand for saying that the graphics card communicates with the monitor to control the monitor's vertical sync. No video standard had contemplated such a capability until then, so G-SYNC was a wake-up call for the industry. The first to take note, obviously, was VESA, which created the Adaptive Sync standard, forcing NVIDIA to improve its own solution and AMD to adopt the standard under a new name, FreeSync.
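The practical difference between a fixed refresh rate and a variable one can be sketched with a toy timing model. This is purely illustrative (the function names, the 60 Hz panel, and the 30–144 Hz VRR range are assumptions for the example, not from any real driver): with fixed vertical sync, a frame that misses a refresh tick must wait for the next one, while a VRR panel simply refreshes when the frame is ready, within its supported range.

```python
import math

def display_time_fixed(frame_ready_ms, refresh_hz=60.0):
    """Fixed refresh: the frame appears at the next tick at or after it finishes."""
    period = 1000.0 / refresh_hz
    return math.ceil(frame_ready_ms / period) * period

def display_time_vrr(frame_ready_ms, last_refresh_ms, min_hz=30.0, max_hz=144.0):
    """Variable refresh: show the frame as soon as it is ready,
    clamped to what the panel physically supports."""
    min_period = 1000.0 / max_hz  # cannot refresh faster than max_hz
    max_period = 1000.0 / min_hz  # must refresh before min_hz elapses
    earliest = last_refresh_ms + min_period
    latest = last_refresh_ms + max_period
    return min(max(frame_ready_ms, earliest), latest)

# A frame finishing at 17 ms just misses the 16.67 ms tick of a 60 Hz panel
# and waits until ~33.3 ms; a VRR panel can show it at 17 ms.
print(display_time_fixed(17))    # ~33.33 ms
print(display_time_vrr(17, 0))   # 17 ms
```

The model also shows why panels advertise a VRR range: a frame arriving too early is held to the panel's fastest period, and one arriving too late forces a repeat refresh.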

This is how NVIDIA’s G-SYNC is dying

One of the key issues with NVIDIA's G-SYNC is that it requires additional hardware to work, which has to be built into the monitor and isn't exactly cheap. Although it provides better features than Adaptive Sync and its derivatives, this is where the so-called "good enough" ends up taking over: most users aren't willing to pay a considerable premium for the solution from Jensen Huang's company.

The board you see in the image above consists of an Intel FPGA and several DDR memory chips. There is no dedicated G-SYNC chip because sales volumes are too low to justify designing a much cheaper ASIC for the job. In other words, the number of users who buy a monitor supporting the technology is not enough to warrant a dedicated processor for those monitors.

And how much does it cost? Well, these boards are not sold directly, but even in large volumes they do not drop below 500 euros. The funny thing is that an ASIC would cost ten times less than the whole assembly. On top of that, the same functionality has been integrated into DisplayPort for years: any graphics display controller that supports the standard gets Adaptive Sync at no additional cost, since the process is built into the video transmitter and receiver themselves. Add to that the existence of VRR support in HDMI 2.1 and you have the final nail in the coffin.

Don’t worry about me, I’m already dead

One of NVIDIA's marketing ploys is the "G-SYNC Compatible" label, which refers to Adaptive Sync monitors validated to work with NVIDIA's graphics cards. At the end of the day, both technologies solve the same problem, and the label is the only way NVIDIA can perpetuate its brand, at least until people's interest disappears completely or drops low enough.


That is to say, the vast majority of screens we see in stores, despite having the initials on the box or the casing, do not use specialized circuitry; they are simply monitors with DisplayPort connections. That saves manufacturers money and spares them from pricing a monitor above the market rate just because it carries the famous green-eye logo.