G-Sync / G-Sync Ultimate

What’s the deal with this technology in current times? With both AMD and Nvidia now supporting adaptive sync (FreeSync), what’s the benefit of shelling out $$$ for a G-Sync-equipped monitor?
 
Slightly lower latency is about it nowadays. Also, FreeSync isn't the same thing as adaptive sync.
 
Variable overdrive is pretty much the only benefit it has anymore. Its feature set is outdated, and it doesn't seem likely Nvidia is going to make a new module with, e.g., HDMI 2.1 or DP 2.1.
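To make "variable overdrive" concrete, here's a rough sketch of the idea (the table and numbers are made up for illustration, not Nvidia's actual tuning): instead of one fixed overdrive setting, the module picks a gain matched to the current refresh rate, which is why module monitors tend to avoid overshoot when VRR drops the frame rate.

```python
# Minimal sketch of refresh-rate-dependent overdrive (illustrative values,
# not Nvidia's actual tuning). A fixed-overdrive monitor applies one gain
# at every refresh rate; a variable-overdrive module interpolates a gain
# matched to the current frame interval, avoiding overshoot at low Hz.

# Hypothetical tuning table: refresh rate (Hz) -> overdrive gain
OD_TABLE = [(48, 0.10), (60, 0.18), (100, 0.30), (144, 0.45), (175, 0.55)]

def overdrive_gain(refresh_hz: float) -> float:
    """Linearly interpolate an overdrive gain for the current refresh rate."""
    if refresh_hz <= OD_TABLE[0][0]:
        return OD_TABLE[0][1]
    for (hz0, g0), (hz1, g1) in zip(OD_TABLE, OD_TABLE[1:]):
        if refresh_hz <= hz1:
            t = (refresh_hz - hz0) / (hz1 - hz0)
            return g0 + t * (g1 - g0)
    return OD_TABLE[-1][1]

def drive_level(target: float, current: float, refresh_hz: float) -> float:
    """Overshoot the pixel drive level in proportion to the transition size."""
    return target + overdrive_gain(refresh_hz) * (target - current)

# With VRR the refresh rate changes every frame, so the gain must track it:
for hz in (48, 75, 120, 175):
    print(hz, round(drive_level(target=0.8, current=0.2, refresh_hz=hz), 3))
```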
 
Slightly lower latency is about it nowadays. Also, FreeSync isn't the same thing as adaptive sync.

Glad you mentioned that.

I’ve got an Alienware 34” QD-OLED with G-Sync Ultimate. Dell recently released the FreeSync (or non-G-Sync, anyway) version of this display. HUB reviewed both and, at least with these two, found the non-G-Sync model had significantly lower latency once you also account for input lag.

It was actually this review that prompted me to ask the question.
 
What’s the deal with this technology in current times? With both AMD and Nvidia now supporting adaptive sync (FreeSync), what’s the benefit of shelling out $$$ for a G-Sync-equipped monitor?
With G-Sync Ultimate you get HDR EOTF tracking tuned to the panel for gaming. For example, AW3423DW vs AW3423DWF: the latter has much worse EOTF tracking (to be fixed by Dell, though). Or X32 FP vs older ones like the X27: the X27 has a much better-looking image in games that don't have the best HDR calibration.
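For anyone wondering what EOTF tracking actually measures: HDR10 signals request absolute luminance through the SMPTE ST 2084 (PQ) curve, and "tracking" is how closely the panel's measured output follows that curve. A quick sketch of the target curve reviewers plot measured output against:

```python
# SMPTE ST 2084 (PQ) EOTF used by HDR10: maps a normalized 0-1 signal value
# to absolute luminance in nits. "EOTF tracking" means the panel's measured
# output follows this curve; a monitor that tracks badly renders HDR scenes
# too bright or too dim relative to the mastering intent.

M1 = 2610 / 16384          # PQ constants from the ST 2084 spec
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Normalized PQ code value (0..1) -> luminance in cd/m^2 (nits)."""
    e = signal ** (1 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return 10000.0 * (num / den) ** (1 / M1)

# Target luminance at a few code values -- the reference line a review
# compares a panel's measured brightness against:
for v in (0.25, 0.5, 0.75, 1.0):
    print(f"signal {v:.2f} -> {pq_eotf(v):8.1f} nits")
```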
 
Every G-Sync and G-Sync Compatible monitor is certified by Nvidia through a rigorous testing process, so you know it's actually good. That's really the only benefit nowadays.

A non-G-Sync monitor could be just as good.

Way back before adaptive sync was a thing, the purpose of G-Sync was to make it a thing, because monitor manufacturers were not doing it on their own.

The first G-Sync "monitor" was actually a DIY kit you bought from Nvidia and used to replace the internals of a monitor you bought separately.

Then Nvidia partnered with monitor makers to ship monitors with Nvidia's G-Sync module.

And now monitor makers just build that capability into their own electronics.
 
What’s the deal with this technology in current times? With both AMD and Nvidia now supporting adaptive sync (FreeSync), what’s the benefit of shelling out $$$ for a G-Sync-equipped monitor?
Though Nvidia has already lowered the G-Sync Ultimate standard from true HDR 1000 to any HDR, the G-Sync module is still the most powerful processing unit in a monitor; it handles the image and the backlight in various ways, not just latency.

Most people don't even know that to fully eliminate screen tearing, G-Sync needs V-Sync enabled at the same time.
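That's the usual "G-Sync on, V-Sync on, cap a few fps below max refresh" recipe: inside the VRR window the monitor follows the GPU, but above it something has to catch the overflow. A rough sketch of the logic (my simplification, not Nvidia's actual pipeline):

```python
# Rough sketch of why G-Sync alone doesn't cover every case (a
# simplification, not Nvidia's actual pipeline). Inside the VRR window the
# monitor refreshes whenever a frame is ready, so no tearing. Above the
# window the panel can't go faster, so the GPU either waits for vblank
# (V-Sync on) or flips mid-scanout and tears (V-Sync off).

VRR_MIN_HZ, VRR_MAX_HZ = 48, 144  # hypothetical panel VRR range

def present(frame_rate_hz: float, vsync: bool) -> str:
    if frame_rate_hz <= VRR_MAX_HZ:
        # Monitor holds vblank until the frame arrives (frame doubling
        # handles rates below VRR_MIN_HZ), so the flip is always clean.
        return "VRR: refresh driven by the GPU, no tearing"
    if vsync:
        return "above VRR range: GPU waits for vblank, no tearing (adds lag)"
    return "above VRR range: flip mid-scanout -> TEARING"

for fps, vs in [(90, False), (90, True), (200, False), (200, True)]:
    print(f"{fps:3d} fps, vsync={vs}: {present(fps, vs)}")
```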

There are all kinds of bugs on monitors without a G-Sync module because they use a weaker scaler.
 
FWIW, I recently got an LG 38GN950-B G-Sync Compatible monitor to replace my aging Acer Predator X34P with native G-Sync. The native G-Sync X34P has less ghosting/overshoot at the lower refresh-rate ranges and looks smoother to my eyes in the G-Sync Pendulum demo with the simulated 40-60 fps test.

Honestly, I didn't mind those differences but I had to return the LG anyway because there was a stuck green pixel smack dab in the middle of the panel. Once I noticed it, it just could not be unseen.
 
There’s a real mixed bag with non-certified monitors. There’s no reason a non-certified monitor can’t be good, but I read reviews from a trusted source carefully and know what my must-haves are, so I don’t have to go through the hassle of returning.
Even with G-Sync Ultimate there may be some problems, but you know there are some categories of issues you don’t have to worry about.
 
It matters much less than it used to. The main thing I've noticed is that G-Sync Ultimate, or whatever they want to call the ones with the actual module, just work and work well at all frequencies. G-Sync Compatible ones CAN work well, but I've seen a few cases where they have issues. When Nvidia has their hardware in the monitor and in the computer, they seem to be able to make it work flawlessly.

As for things like latency, for the most part that seems like something not really worth worrying about these days with a good screen. Maybe if you are a top-tier pro gamer, but all in all the good displays are very low, be they G-Sync or FreeSync. It doesn't seem to matter enough to get worked up over.

In terms of what a G-Sync module actually IS, it's an FPGA and memory in the monitor with Nvidia's programming on it. Normally a monitor is going to use some kind of ASIC that handles things like interfaces, scaling, VRR, all that sort of thing. MediaTek makes some really popular ones. Nvidia designed their own, but they don't make enough of them to warrant making an ASIC, so they just do it with an FPGA.

I suspect it'll go away in the long run. When it was introduced, G-Sync was the only way to do VRR, period. Now VRR support is getting super common and works pretty well with most things.
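For the curious, the core VRR behavior that scaler (ASIC or FPGA) implements boils down to: hold off the next refresh until a frame arrives, and if nothing shows up before the panel's minimum-refresh deadline, repeat the last frame. A toy model of that timing logic (illustrative only, not anyone's actual firmware):

```python
# Toy model of the VRR logic a scaler (ASIC or Nvidia's FPGA) implements:
# delay the next refresh until a frame arrives, but never stretch the
# interval past the panel's minimum refresh -- if no frame comes in time,
# redraw the previous one (low-framerate compensation). Illustrative only.
from typing import Optional

PANEL_MIN_HZ, PANEL_MAX_HZ = 48, 144   # hypothetical panel VRR range
MIN_INTERVAL = 1.0 / PANEL_MAX_HZ      # can't refresh faster than this
MAX_INTERVAL = 1.0 / PANEL_MIN_HZ      # can't hold a frame longer than this

def next_refresh(frame_arrival_s: Optional[float], last_refresh_s: float):
    """Return (time of the next scanout, what gets drawn)."""
    deadline = last_refresh_s + MAX_INTERVAL
    if frame_arrival_s is not None and frame_arrival_s <= deadline:
        # New frame in time: scan it out, but no sooner than the panel allows.
        return max(frame_arrival_s, last_refresh_s + MIN_INTERVAL), "new frame"
    # Frame is late: repeat the previous frame so the panel stays refreshed.
    return deadline, "repeat last frame (LFC)"

print(next_refresh(frame_arrival_s=0.012, last_refresh_s=0.0))  # ~83 fps frame
print(next_refresh(frame_arrival_s=None,  last_refresh_s=0.0))  # GPU stalled
```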
 
In terms of what a G-Sync module actually IS, it's an FPGA and memory in the monitor with Nvidia's programming on it. Normally a monitor is going to use some kind of ASIC that handles things like interfaces, scaling, VRR, all that sort of thing. MediaTek makes some really popular ones. Nvidia designed their own, but they don't make enough of them to warrant making an ASIC, so they just do it with an FPGA.
G-Sync is more important than you'd think. It controls the backlight as well.

Display manufacturers don't make chips, so they can only assemble and tune off-the-shelf chips in a limited way instead of designing their own, the way Apple designed a TCON chip for the Pro Display XDR that drives the backlight 10 times faster than the refresh rate to reduce backlight latency and bloom.
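To give a sense of what "controlling the backlight" means here: with full-array local dimming, the controller computes a level for each backlight zone from the frame content, and how fast and how smart that pass runs determines bloom and backlight lag. A toy version of it (illustrative only; not Apple's or Nvidia's actual algorithm):

```python
# Toy full-array local dimming pass (illustrative; not Apple's or Nvidia's
# actual algorithm). Each backlight zone is set from the brightest pixel it
# covers; the LCD layer then compensates. Running this faster than the
# refresh rate (as described above for the Pro Display XDR TCON) cuts
# backlight latency and lets the dimming track motion with less bloom.

def zone_backlight(frame, zones_x, zones_y):
    """frame: 2D list of pixel luminance 0..1 -> per-zone backlight levels."""
    h, w = len(frame), len(frame[0])
    zh, zw = h // zones_y, w // zones_x
    levels = []
    for zy in range(zones_y):
        row = []
        for zx in range(zones_x):
            block = [frame[y][x]
                     for y in range(zy * zh, (zy + 1) * zh)
                     for x in range(zx * zw, (zx + 1) * zw)]
            row.append(max(block))   # zone must be bright enough for its
        levels.append(row)           # brightest pixel -> bloom around highlights
    return levels

# A small bright highlight on a dark frame lights up its whole zone:
frame = [[0.02] * 8 for _ in range(8)]
frame[1][1] = 1.0
for row in zone_backlight(frame, zones_x=2, zones_y=2):
    print(row)
```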
 