First Time Team Red - thoughts and ideas?

Elf_Boy

I am on the fence between the 3080 FE and the 6800 XT.

I think the big decider is that I have an HDMI FreeSync monitor that NV just won't support. That, plus the greater memory size, which I am hoping is good future-proofing.

For those who are familiar with AMD GPUs, how does the AMD stuff work? Is it a control panel like NV's? Anything I need to know? Tips for performance? When it comes to the AMD software I don't know what I don't know. I'll figure it all out in time, of course - I am a true [H] member :) - but some shared knowledge helps things along.

Of course I have no idea when I will be able to find one in stock.
 
Won't support does not mean it won't work - did you try it already?
 
AMD has a driver control panel, but it is very different from Nvidia's.

AMD's driver interface is a more modern, graphical take, and it's generally much easier to look at and use. It's also more responsive. Nvidia's in-game overlay menu is closer to AMD's interface than the Nvidia Control Panel or even GeForce Experience is. AMD also does not split features and settings out into a separate program, like Nvidia does with GeForce Experience. Nvidia also recently started putting power management and overclocking features into their game overlay, which is annoying. You basically have to flip between three different menu systems/programs to access all of Nvidia's features now.

After that, they are more or less even, IMO, in terms of the features offered in their driver control panels, whether or not those features work, whether you can even see a given feature (depending on the model of GPU you have), etc.

For things AMD does strictly better: I'd say their power management features are better designed for someone to make a couple of clicks, get results, and understand what they are even doing. And their overclocking features are better:
1. AMD's auto-overclocking takes a few seconds vs. ten minutes with Nvidia.
2. AMD lets you overclock VRAM in the driver; Nvidia does not. You have to use third-party programs to overclock Nvidia VRAM, and to overclock the GPU core you have to install GeForce Experience, activate the option, and then actually do it in Nvidia's game overlay menu...

AMD also has better streaming features built into their driver, with enough functionality to act as a sort of OBS-lite.
 
Thank you.
 
The only thing that separates the two, IMO, which would force one to go one direction or the other, is encoding and, in broad terms, the support for the ecosystem. In this respect Nvidia beats AMD. If you don't do a lot of encoding, or if you use CPU encoding, then I'd go AMD, because you get so much more card for less. Personally I'm torn because I want to go AMD, but I also don't want to lose NVENC support, which all of my encoding apps work well with. I'm optimistic that AMD will catch up in that regard, yet I'm still hesitant to replace my Titan XP.
 
Good info, thanks.

I do not do encoding, so it's looking a lot like the 6800 XT - if I can ever find one in stock.
 
The biggest issue with AMD GPUs is the software. Somehow this has been the case since even before AMD purchased ATI. I had a cheap ASRock 5700 XT. It was so disappointing I got a Sapphire 5700 XT. I ended up settling on the 2070 Super. My personal systems have been about 50/50 Radeon and GeForce GPUs. The 6800 XT looks like a great card, but if price doesn't matter I would grab the 3080. If you want to save money and get almost the same performance, then get the 6800 XT. I have a G-Sync 144 Hz and a FreeSync 165 Hz monitor; AMD and Nvidia cards work with both using G-Sync or FreeSync. No matter which one you go with, I'd use MSI Afterburner to tweak the GPU.
 
The biggest issue with AMD GPUs is the software. Somehow this has been the case since even before AMD purchased ATI.
This^. I would also like to go with AMD, but PCMag recently reviewed the new AMD cards and mentioned the driver issues as a serious negative. So I guess I have to Pay The Man to get a stable GPU.
 
Thank you for the information. What specifically would the issue(s) be?

How are you plugged into the monitors? I am guessing DisplayPort. Not an option for me, sadly; my monitors only have HDMI, so if I want FreeSync I'll have to go AMD.

Also FYI: https://www.tomshardware.com/news/gsync-monitor-with-amd-graphics-card-nvidia
 
If you want 60 FPS at 4K, Nvidia is really your only option. DLSS is key. DLSS 2.1 is more or less identical to native 4K in terms of visuals - and that's my personal opinion, I can't really tell any difference. A lot of these games with raytracing, like Tomb Raider, Metro Exodus, Control... and probably Cyberpunk, are doing 45-50 FPS at 4K. You turn on DLSS and it gets you comfortably over that 60 FPS hump.
Now let's say you're on a 1080p or 1440p monitor. In that case, go turn on 4K DSR in the Nvidia control panel. It will produce the best image you will ever see on a 1080p or 1440p monitor. Basically, 4K DSR makes the GPU render at 4K and then shrinks it down to your lower-res screen.
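
For intuition, here is a minimal sketch of the idea (this is not Nvidia's actual DSR filter, which has an adjustable "smoothness" setting; the function name and the plain 2x2 box average below are just illustrative):

```python
import numpy as np

def dsr_like_downscale(frame_4k: np.ndarray) -> np.ndarray:
    """Toy illustration of the supersample-and-downscale idea behind DSR:
    the game renders 3840x2160, then the driver filters that down to the
    panel's native 1920x1080. Real DSR uses a configurable 'smoothness'
    filter; a plain 2x2 box average is used here just to show the principle."""
    h, w, c = frame_4k.shape  # e.g. (2160, 3840, 3)
    out = frame_4k.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))
    return out.astype(frame_4k.dtype)

# Each 1080p output pixel is the average of a 2x2 block of rendered 4K pixels,
# which is also why this behaves like 4x supersampling anti-aliasing.
frame = np.random.randint(0, 256, (2160, 3840, 3), dtype=np.uint8)
print(dsr_like_downscale(frame).shape)  # -> (1080, 1920, 3)
```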

I'm on a ten-year-old 24-inch 1080p monitor. I've had the 3080 FE for two months now and I play everything at 4K DSR with full raytracing, ultra settings, and DLSS in titles that support it. It's really quite amazing. Upgraded from a 980 Ti. And the thing is, I've been out of the gaming loop for 5 years, so I missed a lot of stuff. Going back now and playing all the epic greatest hits from the last 5 years at 4K 60 FPS. You've got no idea how awesome games like Titanfall 2 or Batman: Arkham Knight look in 4K DSR at 60 FPS until you see it for yourself.

With the 6800 XT, you're not going to be able to do 60 FPS at 4K with raytracing. You'll be getting 30 to 50 FPS average. Now DirectML, the competitor to DLSS, might change this in the future. But who knows...
 
Isn't DLSS limited to a pretty short list of titles, only one of which is a game I am playing? As I recall, NV needs to train the AI for DLSS, and they don't seem to be in a hurry with what I mostly play.

I have also read that AMD is working on something equivalent to DLSS; until that is out we won't know how it impacts FPS.

I'm currently looking hard at the 6900 XT; my quarterly bonus was good to me, and at $500 less than the 3090 I think it does well enough.
 
I'm on a ten-year-old 24-inch 1080p monitor. I've had the 3080 FE for two months now and I play everything at 4K DSR with full raytracing, ultra settings, and DLSS in titles that support it. It's really quite amazing.
Uh... you have a 1080p monitor and you bought a 3080 for it? And you render things at 4K even though you can’t display it?

I don’t understand the world anymore.
 
Upscale then downscale, that's the key to the best image quality.

"Upscale then downscale, that's the key to the best *marketing campaign to make consumers spend way more than they actually need to*."

Fixed that for you.
 

:facepalm: Comparing video frames to rendered 3D frames makes absolutely no sense and reveals absolute ignorance about how images are processed and displayed on your screen.

A) Video compresses color (chroma) relative to luminance (4:4:4 vs 4:2:2 vs 4:2:0). That's why 4K video shown at 1080p looks better than captured 1080p: the 1080p output is exactly 1/4 of the 4K resolution spatially, but the chroma information of 4:2:0 4K, which is 1/4 of its pixel count, maps to a full 1/1 sample per pixel in the 1080p result. Meaning: captured, chroma-subsampled 1080p is not really 1080p, it's way crappier, while captured 4:2:0 4K carries similar information to 4:4:4 1080p when downscaled.
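
To put rough numbers on that point (a simplified count that assumes textbook 4:2:0/4:2:2 subsampling and ignores codec details; the helper function is just for illustration):

```python
# Back-of-the-envelope sample counts for the chroma-subsampling argument above.
# Assumes textbook ratios: 4:2:2 = half the chroma samples, 4:2:0 = a quarter.

def sample_counts(width, height, subsampling):
    luma = width * height
    chroma_per_plane = {"4:4:4": luma, "4:2:2": luma // 2, "4:2:0": luma // 4}[subsampling]
    return luma, chroma_per_plane

luma_4k, chroma_4k = sample_counts(3840, 2160, "4:2:0")  # 8,294,400 / 2,073,600
luma_hd, chroma_hd = sample_counts(1920, 1080, "4:2:0")  # 2,073,600 /   518,400

# 4:2:0 4K carries 2,073,600 chroma samples per plane - exactly one per pixel
# of a 1080p frame - so downscaled 4K video behaves like 4:4:4 1080p, while
# captured 4:2:0 1080p only has 518,400 chroma samples per plane.
print(chroma_4k == luma_hd)    # True
print(chroma_4k // chroma_hd)  # 4
```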

B) 3D rendered frames don't work like video. You essentially get a 4:4:4-like image every frame, all the time. So rendering 4x the pixels at 4:4:4 and then giving up 3/4 of the rendered budget is supremely stupid. You will get 1080p at 4:4:4 (again, 3D frames don't really work like this; I say it just so we can establish the connection, since 3D frames are never chroma-subsampled) and have wasted the rest of the perfectly rendered material. Your screen cannot show pixels it doesn't have.

C) There are some tiny, minute gains in AA clarity when you run at 4K and downsample to 1080p, similar to the gains one would get with Supersampling instead of any other reconstructive AA technique (because Supersampling is basically that, rendering at higher res and then downsampling). However, if you think you're reeeeeeeally going to notice those minute differences while you're playing a game, moving your character around, in a firefight, or in combat, or with your brain engaged in literally anything else other than comparing A/B images to notice the difference... you're lying to yourself.

The benefit of rendering at a resolution that you cannot display is, for the majority of consumers, pure marketing brainwashing to get you to buy something way more expensive than you need. You can put an F1 engine in a Honda Civic; it ain't gonna run any better. You can cook something that needs 100F in your oven at 500F and have it done five times as fast; it ain't going to end well.
 
I am on the fence between the 3080 FE and the 6800 XT.

I think the big decider is that I have an HDMI FreeSync monitor that NV just won't support. That, plus the greater memory size, which I am hoping is good future-proofing.

For those who are familiar with AMD GPUs, how does the AMD stuff work? Is it a control panel like NV's? Anything I need to know? Tips for performance? When it comes to the AMD software I don't know what I don't know. I'll figure it all out in time, of course - I am a true [H] member :) - but some shared knowledge helps things along.

Of course I have no idea when I will be able to find one in stock.
What do you do with your system? Is it "just" to play games, or do you do content creation? In the latter case, some applications prefer one vendor over the other.

Tell us more if you haven't already.

Henrik
 
Bro, it's real simple: is 4K DSR on a 1080p screen worse or better than just regular 1080p? Not interested in all the theoretical mumbo jumbo. Hands down, 4K DSR is great. Now why don't I just go buy some 4K monitors? Because I just don't feel like dropping 1600 bucks on computer monitors. I run a dual-monitor setup for stock trading and it seems like a waste of money. If I had to make the move to a 4K screen, I would just buy a 40-inch top-end OLED or something like that. So we work with what we have. For anyone with 1080p monitors, and that's a lot of people: turn on 4K DSR and it's the best image on a 1080p screen you will ever see.

I just go by my own eyes and other people's first-hand experience. Here's a whole thread of them.

https://www.nvidia.com/en-us/geforc...634/dsr-if-you-havent-tried-it-yet-do-it-now/


Here's some dude who went through the trouble of testing it on all sorts of games. One side clearly looks better than the other, and it looks a whole lot better in person. DSR sort of became forgotten because it tanked the FPS to like 20-30. But now, with this new generation of cards, DSR is absolutely playable and breathes new life into 1080p monitors.


 
Bro, it's real simple: is 4K DSR on a 1080p screen worse or better than just regular 1080p? Not interested in all the theoretical mumbo jumbo.
Cool. Let's check the material from your source:

Video frame:
[screenshot]


Native 1080p:
[screenshot]


DSR:
[screenshot]


Great. So with a couple minutes to look for high-frequency detail like that fence, and then look at an A/B side comparison, we can all agree that DSR looks sharper. Now for the kicker:

[screenshot]


Ah yes. Reducing my framerate to 1/3 for a small increase in clarity that I will definitely barely notice while I'm concentrating on anything other than comparing A/B image quality samples will be totally worth it.

Look, I'm not saying that what you're doing doesn't have a positive effect. Anyone with a minor understanding of how rendering works knows that. But don't pretend that what you're doing isn't a waste of that card's power, and your money, for the usage you're giving it. It's OK, we can waste things if we want to, but being able to waste something doesn't automatically make it the smart thing to do. What you are doing is attempting to overcome your monitor's shortcomings with a capped, shortsighted method.

I don't know where you got the $1600 number, but you can certainly buy a couple of 4K 27" monitors for $500 in total, or literally any 40" 4K TV that'll display 4:4:4 for $300 these days (to understand that last part, you'll have to re-read my previous post's "mumbo jumbo", of course). And if you went this route, you could then run games at 4K and get the full benefit instead of toying around with DSR.

Now, I don't know if you can use DSR and DLSS at the same time, but I really hope you're not doing that, because then you're just rendering at lower than 1080p resolution, upscaling to 1080p via DLSS, then upscaling to 4K and downscaling back to 1080p via DSR... and now this is just silly.

If you still think DSR is not a marketing strategy to make you spend more than you need, at this point... well, embrace ignorance then, I guess. If you’re happy, more power - and less money - to you.
 
Why do people buy a 700-horsepower car and drive it around their neighborhood at 40 miles per hour? Because they can. And how could DSR be a marketing strategy if I didn't even know about it until AFTER I bought the 3080? DSR also comes in most handy for big TVs, which a lot of people have now - all those yearly TV sales at Walmart for 60-, 70-, and 80-inch sets, and big-screen 1080p TVs are really common. I've had a 75-inch 1080p hanging in my living room for 5 years now. Hooking the 3080 up to that TV, DSR makes a world of difference. Without DSR it looks like crap; turn on 4K DSR and it's brilliant.

It's like you're trying to convince yourself a Honda Accord gets the job done and that a Dodge Viper is just overkill...
 
Buddy, you don't need to try to convince me. You can like what you like. I just don't understand how you're cool dropping $700 for a 3080, but then you can't spend a mere $240 on a 27" 4K monitor to really enjoy that 3080 at its full capabilities, and instead prefer to use DSR on a monitor that is insanely sub-par when paired with the 3080. Even if you loved 1080p for some reason, you could run pixel-perfect 1080p on a 4K monitor because it's a perfect 4:1 split.
 
I bought a 3080 because I needed a guaranteed 60 FPS framerate for all games for the next 5 years, from 4K all the way down to 1080p. So you say the 3080 is overkill for 1080p. Maybe, maybe not. Just a few months after the 3080 released, we already have Valhalla and Watch Dogs: Legion, which will not hit 60 FPS at 4K and which struggle sometimes at 1440p too. So how are you so sure the 3080 will be enough in the future for lower resolutions? Better safe than sorry. Here's something to prove my point: Watch Dogs is already getting pretty close to 60 FPS at 1440p. How do you know the next Watch Dogs game in 2022 will perform better, or even worse?



[benchmark chart]
 
None of what you just said proves your point. You are rendering at 4K right now, just not displaying it on your 1080p monitor. So all this discussion about future-proofing (apart from being ridiculous - buy for what you want to play now, not years later; you cannot predict this) applies to you just as much as to anyone else. You can buy a 4K monitor and not play games at 4K. You can play games at 1080p and it won't look fuzzy (the way a QHD monitor displaying FHD does) precisely because it's an exact 4:1 ratio. So, your options:

1) 1080p monitor: render at 4K with DSR, then not see most of the benefit.
2) 4K monitor: render at 4K and see the benefit, or render at 1080p and see clear, perfect 1080p with faster performance.

This means there is no downside to getting a 4K monitor for $240 right now. Meanwhile, using your current 1080p monitor is a clear waste of your card's capabilities. Can you do it? Yes. Is it wasteful? Absolutely.
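
To put numbers on that "perfect 4:1 split" / exact-ratio point, here is a tiny illustrative check (nothing vendor-specific, just the per-axis scale factors):

```python
# Why 1080p looks clean on a 4K panel but soft on a QHD panel:
# the per-axis scale factor is a whole number only in the first case.
from fractions import Fraction

source = (1920, 1080)  # FHD content
panels = {"4K 3840x2160": (3840, 2160), "QHD 2560x1440": (2560, 1440)}

for name, (w, h) in panels.items():
    sx, sy = Fraction(w, source[0]), Fraction(h, source[1])
    if sx.denominator == 1 and sy.denominator == 1:
        verdict = f"each source pixel maps to an exact {sx}x{sy} block (pixel-perfect)"
    else:
        verdict = f"scale is {sx}x{sy}, so pixels must be interpolated (looks soft)"
    print(f"{name}: {verdict}")
```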

And keep in mind those options have nothing to do with future-proofing; that's a different discussion. That discussion tends to end on: you can't future-proof, because, as you yourself said, you cannot know what's coming down the pipe. So there's no point in future-proofing, and you should buy what's available now for the games you want to play now.

And because we're straying too far from the point of this thread, and I don't like hijacking threads, to the OP: you won't find tremendous differences between AMD and Nvidia cards. I've had both, and it usually comes down to how a given developer optimizes each specific game. I've found that with native DX12 games my AMD cards used to perform better than Nvidia's, but they also used to consume a lot of power, so I switched back to Nvidia for a cooler, quieter system. In the past 3 years I've been frustrated by VRAM limits, so if AMD gives you more VRAM for the same money as Nvidia, I'd consider it (unless you're going to go heavy into raytracing; in that case go with Nvidia, because they are a generation ahead of AMD there and it shows right now). The AMD settings menu is very different from Nvidia's, and it takes a bit of getting used to if you've had Nvidia before, but both are perfectly workable and work fine.
 
Here ya go, bro... the point is, buy the best when you can buy the best, or find yourself in situations down the road where you wish you had bought the best.


[benchmark chart]
 
4K to 1080p with DSR at 0% smoothness does look amazing and gives the best anti-aliasing you could dream of. In general, supersampling at 2x, 4x, etc. looks great, but it's a lot of wasted horsepower; it's the most inefficient AA technique there is.

A native 4K monitor looks way, way better though (and you can afford a bigger screen size without things looking bad), even though you still need some AA in most modern games. But there are more and more games with very good (as in: good-looking and with a tiny performance impact) AA techniques these days, so it's not really an issue (DSR has a small FPS hit too because of the scaling, so it's probably a draw).
 
Nope, not showing as an option to enable.
You need to make sure your monitor has FreeSync enabled in its own settings menu as well. Usually you can turn it off, and it may be off by default. The Nvidia Control Panel will not show G-Sync/FreeSync settings unless your monitor has it turned on.
 
Thank you for the information. What specifically would the issue(s) be?

How are you plugged into the monitors? I am guessing DisplayPort. Not an option for me, sadly; my monitors only have HDMI, so if I want FreeSync I'll have to go AMD.

Also FYI: https://www.tomshardware.com/news/gsync-monitor-with-amd-graphics-card-nvidia
I would experience random software crashes. My settings would not automatically load, and I would have to load them manually every time. There was a time when my profile would not save at all, so I had to re-enter settings after every reboot or Adrenalin crash. Without tweaking, my 5700 XT performed closer to a 2060 Super; after a little OC it ran almost as fast as a stock 2070 Super. In the end I had a pretty stable setup with the 5700 XT, but I would not rate the software stability higher than a B. Noise was the deciding factor in my choosing the 2070 Super: my 5700 XT's fans ran at almost 3000 RPM and what sounded like 80-90 dBA. I could hear the fans while gaming with a headset and the volume turned up. The heat/noise issue seems to have been resolved in the 6000 series, though. Without any experience with a 3080 or 6800 XT, my purchase would come down to price/performance.

I use DP with my primary monitor and DP to HDMI cables with my older 1080p monitors.
 
What do you do with your system? Is it "just" to play games, or do you do content creation? In the latter case, some applications prefer one vendor over the other.

Tell us more if you haven't already.

Henrik
Mostly play games. Some light Excel/Word use, web browsing, Netflix/Amazon Prime, so decoding 4K HDR streaming video faster would be nice. Not that my system seems to have issues with that as it is.

I'd like to get a VR headset - maybe next quarter with my next bonus.
 
You need to make sure your monitor has FreeSync enabled in its own settings menu as well. Usually you can turn it off, and it may be off by default. The Nvidia Control Panel will not show G-Sync/FreeSync settings unless your monitor has it turned on.
Pretty sure it is on, but I'll double check.
 
Dude, with Nvidia you can use DLSS to upscale 1440p to 4K then use DSR to downscale 4K to 1080p. It's win/win! The more you buy, the more you save!!!
 
This is my NV Control Panel, and I have verified FreeSync is enabled in the monitor.

[screenshot of the Nvidia Control Panel]


No G-Sync option.
 
And are you connected with DisplayPort?

**Oh, we went over this earlier in the thread. I forgot, and then more recent posts started talking about the driver control panel stuff.

Anyway, the only displays for which Nvidia supports FreeSync/G-Sync Compatible over HDMI are the 2020 LG OLED TVs.
No FreeSync monitors work with G-Sync over HDMI.
 