Considering an eGPU for FreeSync on nvidia - Am I crazy?

xp3nd4bl3

2[H]4U
Joined
Sep 25, 2004
Messages
2,259
I've been frustrated over the last few years that wanting both VRR and a top-tier GPU has forced me down the G-Sync route. I have two computers plugged into my Acer X34 G-Sync ultrawide: my gaming PC and my work MacBook Pro 13". I hate that the HDMI port is only good for 60 Hz (and I have to OC it to get that), and I'm ready for a higher refresh rate monitor. There are way more, and arguably better, options if I take G-Sync off my list of requirements.

So my thought is to get an eGPU with a low-end AMD GPU. My gaming PC has a TB3 port (I think it's only x2 PCIe lanes, though that probably doesn't matter since that GPU won't be doing the heavy lifting). The idea would be to use this for both my PC and my Mac. For the Mac I'd get much better performance, and for my PC I could do FreeSync while using my Nvidia GPU as the rendering device.
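Quick back-of-envelope check on that x2 link, since every frame the Nvidia card renders would have to be copied across it to reach the eGPU's display output. The resolution and refresh rate below are just illustrative (X34-class ultrawide pushed to 100 Hz, 32-bit frames):

```python
# Rough estimate of the frame-copy traffic crossing the TB3 link
# when rendering on one GPU and displaying on another.
width, height, hz = 3440, 1440, 100    # assumed ultrawide at 100 Hz
bytes_per_frame = width * height * 4   # 32-bit RGBA
gb_per_s = bytes_per_frame * hz / 1e9
print(f"{gb_per_s:.2f} GB/s")          # ~1.98 GB/s
```

Two PCIe 3.0 lanes top out around 2 GB/s after encoding overhead, so at ultrawide resolutions and high refresh rates the copy alone could come close to saturating the link.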

It all sounds kind of science-experiment-y, but I used to have a Razer Blade Stealth and a Razer Core, and Windows' ability to select a "preferred graphics processor" seemed to work just fine.

Can anyone think of why this would be a bad idea?

Of course, if that new 38" LG G-Sync monitor indeed has an HDMI 2.0 port, this may be a moot point and I could just go that route...
 
Why would it matter that the MacBook Pro 13" only does 60Hz?
It is not a gaming device, and for desktop use 60Hz is more than enough... and would it even support more than 60Hz anyway?

In the future we will have the 8-core Ryzen 5 3600G with an integrated GPU, and that will be an awesome option for letting NV users utilize FreeSync. Future Intel processors will also support Adaptive Sync, so there is a chance a similar trick will work on them too.

HDMI 2.0 changes nothing regarding VRR on NV cards. Maybe the next generation will support it along with Adaptive Sync, but I would not bet on it...

For now I would recommend not upgrading the monitor and waiting.
 
Eh, wait for HDMI 2.1 monitors, and hopefully AMD will have a GPU out by then that doesn't blow.

Yep, remember G-Sync doesn't support all games and glitches plenty (it sometimes fails in windowed/borderless mode, and also doesn't work with game engines where physics is tied to framerate, to name a few examples). Once HDMI 2.1/VRR becomes a standard with the new consoles, it should work with all games. At that point gsync will just blow, a proprietary solution that costs more and doesn't work as well. Nvidia can either support VRR at that point or just hope they can maintain a lead on AMD forever.
 
Yep, remember G-Sync doesn't support all games and glitches plenty (it sometimes fails in windowed/borderless mode, and also doesn't work with game engines where physics is tied to framerate, to name a few examples). Once HDMI 2.1/VRR becomes a standard with the new consoles, it should work with all games. At that point gsync will just blow, a proprietary solution that costs more and doesn't work as well. Nvidia can either support VRR at that point or just hope they can maintain a lead on AMD forever.
G-Sync is still better than AMD's Adaptive Sync support.
HDMI 2.1 won't improve anything from a software point of view. AMD's implementation will stay the same as it is. If NV decides to support HDMI VRR, it will work at best the same as G-Sync, if not worse. Intel will also support VRR, but I am not holding my breath regarding their drivers...

Not all games work well with G-Sync, but most do, and for now it is the best VRR tech out there.
No one likes NV not supporting Adaptive Sync, but that is no reason to hate the tech itself.
 
or just hope they can maintain a lead on AMD forever.

If one were to make an objective assessment, this would be probable.

Once HDMI 2.1/VRR becomes a standard with the new consoles, it should work with all games.

So should DX12...

At that point gsync will just blow, a proprietary solution that costs more and doesn't work as well.

It already works well. Better than Freesync, and there's very little reason to believe that Freesync2 will close that gap, given that it is less 'free' than Freesync is (which isn't 'free' for the consumer).
 
It already works well. Better than Freesync, and there's very little reason to believe that Freesync2 will close that gap, given that it is less 'free' than Freesync is (which isn't 'free' for the consumer).
Actually, it is.
Most FreeSync implementations are very simple and do pretty much what the LCD electronics do anyway. Manufacturers just test whether Adaptive Sync works with the scaler they are using, do some tweaks, limit the maximum frequency range to be on the safe side, and that is your typical FreeSync monitor :ROFLMAO: And this is why they have ranges like 45-60Hz...
Some non-Adaptive Sync monitors even support it if you edit the EDID to tell the AMD card that they do, and that should tell you something :ROFLMAO:
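To make the EDID point concrete: the refresh range the driver keys off is just the standard Display Range Limits descriptor in the monitor's EDID. Here is a minimal sketch of pulling it out of a base block; the synthetic EDID bytes at the bottom are made up for illustration:

```python
# Sketch: find the Display Range Limits descriptor (tag 0xFD) in a
# 128-byte EDID base block and read the vertical refresh range from it.
def vertical_refresh_range(edid: bytes):
    """Return (min_hz, max_hz) from the range-limits descriptor, or None."""
    assert len(edid) >= 128
    for off in (54, 72, 90, 108):          # the four 18-byte descriptor slots
        d = edid[off:off + 18]
        # Display descriptors start with 00 00 00 <tag> 00
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]              # min / max vertical rate in Hz
    return None

# Synthetic example: a descriptor advertising 48-144 Hz in the first slot
edid = bytearray(128)
edid[54:63] = bytes([0, 0, 0, 0xFD, 0, 48, 144, 30, 160])
print(vertical_refresh_range(bytes(edid)))  # → (48, 144)
```

If the scaler can actually track a varying refresh inside that advertised range, the card can drive it adaptively, which is why a simple EDID override is sometimes enough.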

Now, good FreeSync implementations like FreeSync 2 are more advanced and need fancier scalers, but with electronics of this type you get what the manufacturer put there, and putting that stuff in is not very expensive.
G-Sync is still more advanced than even FreeSync 2: the module has its own memory and is designed around this feature. NV is also more serious about supporting their own technology and fixing driver issues.

Ultimately, from a user's perspective, I can be sure a G-Sync monitor will work well.
What is unfair is that NV purposefully keeps this G-Sync exclusivity on their GPUs despite knowing full well that most of their users now own Adaptive Sync monitors, so their strategy makes their customers get a worse gaming experience than they otherwise would. I am completely against this... but at the same time I will just get G-Sync monitors if that is what I have to do... :( I hope my hard-earned money will make NV happy :meh:
 