Nvidia's started supporting FreeSync

False alarm, it seems.

Update 9/30/18 3:22 AM: After further research and the collection of more high-speed camera footage from our G-Sync displays, I believe the tear-free gameplay we're experiencing on our FreeSync displays paired with GeForce cards is a consequence of Windows 10's Desktop Window Manager adding some form of vsync to the proceedings when games are in borderless windowed mode, rather than any form of VESA Adaptive-Sync being engaged with our GeForce cards. Pending a response from Nvidia as to just what we're experiencing, I'd warn against drawing any conclusions from our observations at this time, and I sincerely apologize for any misleading conclusions we've presented in our original article. The original piece continues below for posterity.
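For anyone who wants to poke at the DWM angle themselves: whether a windowed game can bypass DWM's implicit vsync at all is exposed through DXGI. Here's a minimal C++ sketch (assumes Windows 10 and a DXGI 1.5-capable SDK); note it only queries the tearing capability that windowed VRR rides on, it doesn't prove adaptive sync is actually engaged.

```cpp
// Minimal sketch: ask DXGI whether windowed-mode tearing is available.
// If this reports "no", any borderless-windowed game is being composited
// -- and effectively vsynced -- by DWM.
#include <dxgi1_5.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

int main() {
    Microsoft::WRL::ComPtr<IDXGIFactory5> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;  // DXGI 1.5 not available on this system

    BOOL allowTearing = FALSE;
    if (FAILED(factory->CheckFeatureSupport(DXGI_FEATURE_PRESENT_ALLOW_TEARING,
                                            &allowTearing, sizeof(allowTearing))))
        allowTearing = FALSE;

    // "yes" only means a game *can* bypass DWM's implicit vsync via the
    // ALLOW_TEARING swap-chain/present flags; it does not prove that
    // adaptive sync is engaged on the display.
    std::printf("Windowed tearing support: %s\n", allowTearing ? "yes" : "no");
    return 0;
}
```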
 
It's about fucking time. Now if we could just get everything onto a single standard. I'm sick of DisplayPort, Thunderbolt, USB-C, HDMI, and all this horseshit.

I want to use the same home theater receiver I use for movies for my computer. Is it really that hard to get right? Fucking boobs.
 
Huge news. Massive hit for AMD.

Yup, the appeal of buying a Radeon GPU to pair with a cheaper FreeSync monitor is now entirely out the window, since you can just do the same with a GeForce card. RIP.
 
Yup, the appeal of buying a Radeon GPU to pair with a cheaper FreeSync monitor is now entirely out the window, since you can just do the same with a GeForce card. RIP.

I hope AMD can pull a rabbit out of the hat with Navi. It would really suck if they left the GPU market.
 
I think this direction was pretty obvious given developments over the last little while, especially the heavier emphasis on HDMI 2.1 with respect to VRR support. The continued use of an FPGA, and seemingly no inclination to lower the unit costs (I don't mean end-user price) to this point, suggested to me it was unlikely they'd keep up a full push on G-Sync modules.

The actual delay in doing so was likely down to considerations of how to:

- Market this so it gets associated with Nvidia (or whatever brand they want) and dissociated from the term FreeSync.

- Minimize the disruption to the existing G-Sync market and ecosystem.
 
I'm excited. My FreeSync monitor is on their supported list, so I guess that means it should work without a hitch (pun?). I can't wait to find out what all this sync rage is about.
 
Today is my birthday. Hence, CES week is a giant gift to me each and every year.

Between this news and full HDMI 2.1 coming to LG OLEDs, it's the best birthday ever!
 
I may forgo buying a gaming monitor now and just purchase a new LG OLED or Samsung FALD QLED this year :)

Now, if LG or Samsung (or both) could introduce a ~40" variant of those TVs, it would be awesome!
 
Makes sense for Nvidia, given that FreeSync was just about AMD's last marketing advantage.
 
Makes sense for Nvidia, given that FreeSync was just about AMD's last marketing advantage.

Since the Bitcoin mining boom died down, AMD's cards offer good value in the midrange.

The driver suite is also insanely good now:
- The UI looks great
- Overclocking, voltages, fan profiles, even memory timing presets directly in the settings
- In-game overlay, with performance metrics and toggles for all the important features, etc.
- Radeon Chill is by far the best alternative to in-engine capping for keeping frame rates inside the adaptive-sync range (yeah, G-Sync monitors are mostly 144 Hz+ and might not need it, but you pay a huge markup for that); a rough sketch of the capping idea follows below.
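For anyone curious what frame capping for adaptive sync actually involves, here's a minimal sketch. This is not Radeon Chill itself (which also adapts to input and on-screen motion); the 141-on-144 numbers and the `run_frame` stub are illustrative assumptions.

```cpp
// Minimal frame-cap sketch: pace frames to a fixed budget set a few Hz
// below the panel's maximum so frame delivery stays inside the VRR window.
#include <chrono>
#include <cstdio>
#include <thread>

// Stand-in for the game's per-frame work (assumption for the example).
static void run_frame() {
    std::this_thread::sleep_for(std::chrono::milliseconds(3));
}

int main() {
    using clock = std::chrono::steady_clock;
    const double cap_hz = 141.0;  // e.g. 3 Hz under a 144 Hz panel's ceiling
    const auto frame_budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / cap_hz));

    auto next = clock::now();
    for (int frame = 0; frame < 300; ++frame) {  // bounded run for the demo
        run_frame();
        next += frame_budget;
        // Coarse sleep; real limiters sleep most of the wait and spin the
        // tail for sub-millisecond accuracy.
        std::this_thread::sleep_until(next);
    }
    std::puts("done");
    return 0;
}
```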
 
Yeah, I get that. I was just pointing out that FreeSync was a big selling point for AMD.

Something that could potentially make someone switch from Nvidia, or stay locked into AMD (if they'd already bought a monitor).

The other features in the driver are nice, but not dealmakers. So now AMD is competing basically only on price.
 
Welcome to the party. Now if they can just use the universal ray tracing code from DirectX 12.
 
AMD could use that code too, show better performance, and make RTX look like shit. If they can deliver...
At some point, yeah, but I feel like they are far behind right now (probably by at least a year), so even if they do pull it off, Nvidia will likely have progressed as well.
 
AMD could use that code too, show better performance, and make RTX look like shit. If they can deliver...
Oh, I completely agree; it's just that they already do support it.
However, I would like to see what real-time ray tracing on AMD actually looks like at the moment. Given that Nvidia's giants can't even drive it properly, I doubt AMD could either, so I get why it isn't being discussed atm.
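Since the thread keeps coming back to "universal" DXR: the reason the same ray tracing code can run on any vendor's GPU is that a game asks D3D12 for a raytracing tier, never for a GPU brand. A minimal sketch of that capability check (standard D3D12 API; assumes a Windows 10 October 2018 SDK or newer):

```cpp
// Minimal sketch: the vendor-neutral DXR capability check.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    // nullptr adapter = the default GPU, whoever makes it.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier != D3D12_RAYTRACING_TIER_NOT_SUPPORTED) {
        // Any driver reporting a tier >= 1.0 runs the same DXR code path,
        // regardless of whether the GPU is AMD or Nvidia.
        std::printf("DXR supported, tier %d\n", (int)opts5.RaytracingTier);
    } else {
        std::printf("No DXR support on this device/driver\n");
    }
    return 0;
}
```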
 
Excellent news, now I can buy one of these suckers.

https://rog.asus.com/articles/gamin...n-gaming-with-freesync-hdr-2-at-4k-and-120hz/

Enjoy the benefits of 120 Hz VRR gaming on PC via DisplayPort, then play my Xbox One X with VRR, with plenty of HDMI ports (3x 2.0b) left over for my PS4 Pro too. All on the same display.

Excellent news and good upgrade path for my current Sony 43X720e.
As I said in the other thread, it looks to be the same panel the Wasabi Mango UHD430 uses and supposedly they're going to enable Freesync for it at some point.
 
Very good news. I'm interested in how it looks with a 1000- or 2000-series Nvidia GPU on a Samsung Q9FN with dynamic FALD and HDR enabled in an HDR PC game. I'm looking forward to the continuation of that product line in 2019, hopefully with HDMI 2.1. The 2018 Q9FN already supports FreeSync/VRR and 1440p 120 Hz at very low (~10 ms) input lag.

At this point I'm about to leave monitors altogether for gaming purposes and only use them for desktop/apps, if that (I already have a 43" 4K 60 Hz VA TV in my desktop monitor array for desktop real estate and media playback). Though I'll have to rearrange my PC battlestation for large-TV viewing distances, it could work out even better in the long run.
 
As I said in the other thread, it looks to be the same panel the Wasabi Mango UHD430 uses and supposedly they're going to enable Freesync for it at some point.

No. The Mango uses a cheap IPS panel. This 43" ROG is a 10-bit VA panel with 600 nits. Totally different league.
 
Yup. It also has much higher color volume (90% DCI-P3, IIRC).
That explains why it uses FreeSync 2. 600 nits is low for HDR, correct? I thought 1000 was considered the minimum?

What is the expected price on this? $2k?
 
That explains why it uses FreeSync 2. 600 nits is low for HDR, correct? I thought 1000 was considered the minimum?

What is the expected price on this? $2k?
While 600 nits is a bit lower than ideal for HDR, the bigger issue (for HDR) is that this monitor doesn't have FALD, nor is it an OLED. So I don't ding it too much for not being ideal for HDR, because it isn't really designed to be.
 
This is something long overdue. I'm going to retain some skepticism that it's for real until we see how well it pans out. But some of us would be perfectly happy with 80-100 Hz and variable refresh without a lot of extras. This may well keep us as NV customers :)
 
According to a Reddit user who owns one of the modern Samsung HDR TVs, it depends on the resolution.
The HDMI 2.0b FreeSync implementation at 4K is only 48-60 Hz, so practically useless, but that resolution is already capped at 60 Hz in general.

Hopefully it scales into the higher ranges with HDMI 2.1, so it would be 20-120 Hz like it is at 1080p. Their TVs are also capable of running 1440p resolutions. If they could do 20-120 Hz at 1440p, plus 4K 120 Hz, that would be a great usage scenario, as long as your frame-rate graph didn't dip below 20 fps, which would be a pretty bad range to be operating in anyway, imo. A 90 or 100 fps average should mostly range around 60-90-120, or my preference of a 100 average => 70-100-130 (even spikes to 160 possible). You can cap the frame rate at 118 on the top end.


HDMI 2.0b model
FreeSync range on this TV depends on what resolution you are running at.

@ 4K the range is 48-60 Hz

@ 1080p the range is <20-120 Hz

I have the Q8FN, a very similar TV, just the slight step-down model.

Per Linus Tech Tips on the Q9F, the "Ultimate" range at 1080p was 48-120 fps.

Their flagship is different... Q9/Q9FN... can't find any Hz-range info on RTINGS or anywhere for that model yet. It'll change with HDMI 2.1 anyway though, as 4K 120 Hz opens up.
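As a side note on why a 48-60 Hz range is "practically useless" while 20-120 Hz is genuinely useful: a narrow range rules out low-framerate compensation, where the driver repeats frames to pull a low frame rate back inside the VRR window. A toy calculation, with a hypothetical helper name, assuming the common repeat-until-in-range behavior:

```cpp
// Toy low-framerate-compensation (LFC) math. Returns the frame-repeat
// multiplier, or 0 if the VRR range is too narrow for LFC at that rate.
#include <cstdio>

int lfc_multiplier(double fps, double vrr_min_hz, double vrr_max_hz) {
    if (fps >= vrr_min_hz) return 1;         // already inside the range
    int m = 2;
    while (fps * m < vrr_min_hz) ++m;        // repeat frames to climb back in
    return (fps * m <= vrr_max_hz) ? m : 0;  // must also fit under the max
}

int main() {
    // 4K over HDMI 2.0b (48-60 Hz): 40 fps doubled is 80 Hz > 60, so no LFC.
    std::printf("40 fps in 48-60 Hz: x%d\n", lfc_multiplier(40, 48, 60));
    // 1080p (20-120 Hz): even 15 fps doubles to 30 Hz, comfortably in range.
    std::printf("15 fps in 20-120 Hz: x%d\n", lfc_multiplier(15, 20, 120));
    return 0;
}
```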
 
According to a Reddit user who owns one of the modern Samsung HDR TVs, it depends on the resolution.
The HDMI 2.0b FreeSync implementation at 4K is only 48-60 Hz, so practically useless, but that resolution is already capped at 60 Hz in general.

Hopefully it scales into the higher ranges with HDMI 2.1, so it would be 20-120 Hz like it is at 1080p. Their TVs are also capable of running 1440p resolutions. If they could do 20-120 Hz at 1440p, plus 4K 120 Hz, that would be a great usage scenario, as long as your frame-rate graph didn't dip below 20 fps, which would be a pretty bad range to be operating in anyway, imo. A 90 or 100 fps average should mostly range around 60-90-120, or my preference of a 100 average => 70-100-130 (even spikes to 160 possible). You can cap the frame rate at 118 on the top end.

HDMI 2.0b model

Per Linus Tech Tips on the Q9F, the "Ultimate" range at 1080p was 48-120 fps.

Their flagship is different... Q9/Q9FN... can't find any Hz-range info on RTINGS or anywhere for that model yet. It'll change with HDMI 2.1 anyway though, as 4K 120 Hz opens up.
I think I heard a YouTuber (maybe Linus) say that Nvidia isn't supporting adaptive sync via HDMI.
 
Hope that's not true. That would really suck. AMD GPUs and the Xbox both do.


edit: thanks for the heads up
Linus' video confirmed two important things (thanks /u/evaporates for the TL;DW):
  • Confirmation that Nvidia is not charging monitor manufacturers anything to get G-Sync Compatible certified.
  • HDMI is not currently supported, but they do not rule out future support.
 
Uh, not supported through HDMI? I knew there had to be some catch. What a load of fucking crap. That basically eliminates OLED TV support. Holy cow these companies fucking suck.
 
Maybe they will skip it until their GPUs have HDMI 2.1 on them. I hope it's not that long a wait, though.
 