The Current Dilemma: RTX 3080 vs 6900 XT

The only reason here to go with the 6800 XT is simply to have something different. Performance is more or less a wash, and Nvidia wins on extra features such as better RT performance, DLSS support, Nvidia Broadcast, and industry-leading video encoder quality.
Have one already. Currently running a 3090, 3070, 6800 XT, and a 2080 Ti. This is the last system.
 
Simply a matter of bandwidth, and Nvidia has the wider bus this round. AMD is working on that, I'll bet.
I guess you haven't been following any rumors then, because they're just going to increase the Infinity Cache and stick with the same bus width. And bus width isn't even really important; it's the overall performance you get from the GPU that matters. Most people, even with higher-end GPUs, are at 1440p, not 4K, so if anything AMD is the better performer for many, given the poor scaling and overhead Nvidia has with Ampere. It's just silly that I see so many people running a 3090 at 1440p, especially with an older CPU. Hell, I've even seen a few idiots running 3090s and 3080 Tis at 1080p.
 
Depends on your setup. I was running a 2080 Ti with a 1080p ultrawide for a while and it was great.

In my case, the screen was 165 Hz, so I could get great high refresh performance even on very demanding games. And on lighter games I ran DSR 5K and it was awesome.

But 1080p is kind of low these days, so I run a 1440p UW now and I can usually get in the 100 fps range (give or take) at high settings. It was fun to max out games and get 165 fps, but 100 fps is still good with FreeSync.
 
I used a 3080 Ti for a few days and it was just twiddling its thumbs in most games at 1440p; several games would only see around 75 to 80% GPU usage running uncapped, and that's with a 9900K. And just because you have a high-refresh monitor doesn't mean the GPU can actually fully utilize it, as I saw games that completely crapped themselves, stuttering like mad when CPU limited. Something like The Outer Worlds is a stuttering fucking mess trying to hold more than 70 to 75 FPS in many areas of that game. You truly get a worse experience when you become CPU limited as opposed to GPU limited.
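To put that stutter point in numbers, here is a minimal sketch (Python, with made-up frame times rather than measurements from any of these games) of why an average FPS figure can look fine while the 1% lows expose the CPU-limited hitching:

# Illustrative only: fabricated frame times, not captured from a real benchmark.
def fps_stats(frame_times_ms):
    """Return (average FPS, 1% low FPS) from a list of frame times in milliseconds."""
    avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
    slowest = sorted(frame_times_ms, reverse=True)       # worst frames first
    worst_1pct = slowest[: max(1, len(slowest) // 100)]  # the worst 1% of frames
    low_fps = 1000 / (sum(worst_1pct) / len(worst_1pct))
    return avg_fps, low_fps

# Mostly smooth 10 ms frames (100 FPS) with an occasional 40 ms CPU-bound hitch.
frames = [10.0] * 99 + [40.0]
avg, low = fps_stats(frames)
print(f"average: {avg:.0f} FPS, 1% low: {low:.0f} FPS")  # ~97 FPS average, 25 FPS low

The average barely moves, but those hitches are what you actually feel.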
 
In some games that is true, but overall I didn't have that many issues. Maybe there were a few games that were heavily CPU bound (like Far Cry 5) but that was more rare.
 
I guess you haven't been following any rumors then, because they're just going to increase the Infinity Cache and stick with the same bus width. And bus width isn't even really important; it's the overall performance you get from the GPU that matters. Most people, even with higher-end GPUs, are at 1440p, not 4K, so if anything AMD is the better performer for many, given the poor scaling and overhead Nvidia has with Ampere. It's just silly that I see so many people running a 3090 at 1440p, especially with an older CPU. Hell, I've even seen a few idiots running 3090s and 3080 Tis at 1080p.
I run a 3090 on a custom loop with a 1440p screen, because I want to turn on every single option, ray tracing as high as it goes where applicable, and still have 100+ FPS. 1440p is still the sweet spot for that: everything on, perfectly stable performance. At 4K you sometimes have to turn a few things down or use DLSS. Also, there isn't a good 32-ish-inch 4K screen out with the specs I want for a reasonable price; the god monitor from Asus is really the first, and that sucker is $5k. Big TVs don't count; I do other things on my system, and needing 200% scaling to make text readable sucks.
 
I run a 3090 on a custom loop with a 1440p screen, because I want to turn on every single option, ray tracing as high as it goes where applicable, and still have 100+ FPS. 1440p is still the sweet spot for that: everything on, perfectly stable performance. At 4K you sometimes have to turn a few things down or use DLSS. Also, there isn't a good 32-ish-inch 4K screen out with the specs I want for a reasonable price; the god monitor from Asus is really the first, and that sucker is $5k. Big TVs don't count; I do other things on my system, and needing 200% scaling to make text readable sucks.
Wow, your eyes are worse than mine. 👀
 
Wow, your eyes are worse than mine. 👀
4K at 100% scaling is ~tiny~. I have 20/20 vision. Unless you're sitting close, it's not usable for coding/text/etc., and if you're sitting close, the screen sucks for anything else like gaming. Screen real estate isn't useful if you have to make everything big enough to see that it eats up as much physical space as it would on a smaller screen, or more. Or are you moving 3-4 ft further away to play a game?
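For what it's worth, the pixel-density math behind that complaint (a quick illustrative calculation; the panel sizes are just common examples):

import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel of the given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI, fine at 100% scaling
print(f'32" 4K:    {ppi(3840, 2160, 32):.0f} PPI')  # ~138 PPI, text gets small at 100%
print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI, basically needs scaling

Around 138+ PPI you either scale (and give back the real estate) or sit closer, which is exactly the trade-off being described.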
 
Covid made my eyesight much worse, so I may even go 1440p when I get a 32-inch screen...
 
What about 6900XT vs. 3090? For 90% productivity work? DaVinci Resolve, Premiere Pro, Lightroom, various NLEs with lots of plugins, etc.
 
DLSS is just a way to boost performance, especially for HDR and when you are running out of VRAM; it's like a band-aid for the RTX 3080's smaller VRAM compared to the RX 6900 XT, which tends to be faster in every scenario except ray tracing. Even there, depending on the game, it can be almost as fast, as in Resident Evil, Forza Horizon, and Godfall. (Some other games optimized for Nvidia's way of handling RT will run noticeably worse on AMD hardware, but they won't reach the point of being unplayable unless you game at 4K, which Nvidia can't handle either, hence DLSS.) Luckily, AMD's FSR is gaining traction fast and benefits even old Nvidia GPUs. Considering the 6900 XT uses less power and is showing noticeable performance improvements in recent games, that just underscores its superiority over the RTX 3080.
 
They're both really, really good cards. RT works better on Nvidia, FSR shows huge promise to potentially unseat DLSS (which works amazingly right now) and do so more cheaply and efficiently, and Nvidia has the unquestioned king (the 3090). You ain't going wrong either way. Pick the features that matter to you, or the drivers you prefer, but we're in a good time where both options are great. Hard to find, but great.
 
RT is the new G-Sync: Nvidia needs something to cram down our throats marketing-wise. Ultimately, will you notice it on vs. off? Probably not. Are you going to keep it on in a high-performance multiplayer game? Probably not. But hey, you get to brag about how awesome Control and Cyberpunk look, woooooooooooow :/ Seriously surprised people are clinging to RT as much as they are; gotta love fanboys. Would love to see tech reviewers do a wide-scale Nvidia vs. AMD "blind" comparison test.
 
RT is the new G-Sync: Nvidia needs something to cram down our throats marketing-wise. Ultimately, will you notice it on vs. off? Probably not. Are you going to keep it on in a high-performance multiplayer game? Probably not. But hey, you get to brag about how awesome Control and Cyberpunk look, woooooooooooow :/ Seriously surprised people are clinging to RT as much as they are; gotta love fanboys. Would love to see tech reviewers do a wide-scale Nvidia vs. AMD "blind" comparison test.
Don't people in this niche hobby buy these expensive graphics cards precisely for those wooooooooooww moments when technology advances, though? 4K, VR, RTX, etc.

This is kinda like complaining, "Oh, look at that guy in his yellow Lambo; it's not like he's gonna drive it 150 mph on the 65 mph freeway."
 
would love to see tech reviewers do a wide-scale Nvidia vs. AMD "blind" comparison test
This, and in two parts: take, say, 100 people and the 5 best implementations of RT/DLSS in games.

Four tests, trying to lock 60 fps:
1) RT on
2) RT off
3) RT off, with other graphics settings pushed so the FPS matches RT on
4) A run that is exactly the same as one of the other three, just to judge how much of "this looks better than that" is real versus people trying to perform in a blind test.

Same for RT+DLSS, and DLSS alone.

We could do 1080p vs. 1440p vs. 4K at the same time. One difficulty, cost-wise, is that you arguably need to live with RT on in a game for a while before you can feel that something is off when it is turned off; the difference can be subtle, but like CGI versus real footage in a movie, maybe some part of the brain feels it.
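A rough sketch of how those runs could be randomized and blinded (the game names and the helper are placeholders for illustration, not a real tool):

import random

GAMES = ["Game A", "Game B", "Game C", "Game D", "Game E"]   # the 5 showcase RT/DLSS titles
CONDITIONS = [
    "RT on",
    "RT off",
    "RT off, settings raised to match RT-on FPS",
    "repeat of a previous condition",                        # the placebo/control run
]

def build_schedule(participants=100, seed=1):
    """Give each participant every game/condition pair in a shuffled, unlabeled order."""
    rng = random.Random(seed)
    schedule = {}
    for person in range(participants):
        runs = [(game, cond) for game in GAMES for cond in CONDITIONS]
        rng.shuffle(runs)               # blind: presentation order carries no information
        schedule[person] = runs
    return schedule

print(len(build_schedule()[0]))         # 20 runs per participant (5 games x 4 conditions)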

I would be a bit surprised if it has never been done.

Nvidia invested a fortune in AI, and it makes a lot of sense for them to have a unified product line, with the pro cards being almost the same as the gamer cards; they need to find a use for that hardware in games, regardless of how worthwhile it is, which opens the window for this kind of push. Looking back at the 3D card world when the Voodoo 2 was released, I feel like the giant amount of hype I had for it as a kid was not fully justified (the TNT was a June 15, 1998 release, the Voodoo 2 a February one).

Maybe you need HDR; mine is a cheap 400-nit one and I think I can feel the difference, but I would like to see that experiment done on the nicest hardware to see how much is really there.

What is nice for people who work on a regular low-cost budget is that we get ridiculous OptiX rendering and other stuff at a low price in this model, but it still seems up in the air how long it will take to become significant on the player side of things.
 
RT is the new G-Sync: Nvidia needs something to cram down our throats marketing-wise. Ultimately, will you notice it on vs. off? Probably not. Are you going to keep it on in a high-performance multiplayer game? Probably not. But hey, you get to brag about how awesome Control and Cyberpunk look, woooooooooooow :/ Seriously surprised people are clinging to RT as much as they are; gotta love fanboys. Would love to see tech reviewers do a wide-scale Nvidia vs. AMD "blind" comparison test.
I notice it on, and yes, Control looked amazing with it on. I don't play multiplayer games anymore; it's just another visual feature to enjoy, making things look more lifelike or better. And before you call me a fanboy, I have both Nvidia's and AMD's latest cards.
 
You are only running a 60 Hz screen...

So:
1) Making that switch isn't going to gain you any performance; you are capped at 60 Hz.
2) You gain $100, and your friend gets an amazing deal.
3) You give up tech that you don't use.

The calculation in all of this is how long you are planning on keeping the card.
If it's more than 2 years, stick with the 3080. More and more RT games are coming out all the time, and that is in fact the future of gaming. Even consoles are trying to (poorly) support it, which also means more developers are being steered in that direction.
If it's less than 2 years, and you aren't recouping costs by mining, then you can get $100 without losing any performance.

Hopefully by the time you are ready to upgrade again, the GPU market and global supply will have returned to some sense of normalcy. If it has, you should be fine with either choice. If it hasn't, you will probably end up either glad you stuck with the 3080, or wishing you had if you didn't.
 
This, and in two parts: take, say, 100 people and the 5 best implementations of RT/DLSS in games.

Four tests, trying to lock 60 fps:
1) RT on
2) RT off
3) RT off, with other graphics settings pushed so the FPS matches RT on
4) A run that is exactly the same as one of the other three, just to judge how much of "this looks better than that" is real versus people trying to perform in a blind test.

Same for RT+DLSS, and DLSS alone.

We could do 1080p vs. 1440p vs. 4K at the same time. One difficulty, cost-wise, is that you arguably need to live with RT on in a game for a while before you can feel that something is off when it is turned off; the difference can be subtle, but like CGI versus real footage in a movie, maybe some part of the brain feels it.

I would be a bit surprised if it has never been done.

Nvidia invested a fortune in AI, and it makes a lot of sense for them to have a unified product line, with the pro cards being almost the same as the gamer cards; they need to find a use for that hardware in games, regardless of how worthwhile it is, which opens the window for this kind of push. Looking back at the 3D card world when the Voodoo 2 was released, I feel like the giant amount of hype I had for it as a kid was not fully justified (the TNT was a June 15, 1998 release, the Voodoo 2 a February one).

Maybe you need HDR; mine is a cheap 400-nit one and I think I can feel the difference, but I would like to see that experiment done on the nicest hardware to see how much is really there.

What is nice for people who work on a regular low-cost budget is that we get ridiculous OptiX rendering and other stuff at a low price in this model, but it still seems up in the air how long it will take to become significant on the player side of things.
Too many reviewers are wasting time with canned benchmarks and not even looking at the game, let alone playing it. It can be laughable watching some reviewers play a game and how poorly they do, which tells you they have not actually played it while touting a given line. My views:

RT in BFV is so out of place, with shiny mirror-surface cars, windows, floors, etc. in a war zone -> amazing cleaning must be happening in that game. It makes virtually zero difference in playability, and depending on hardware it can degrade the experience.
RT in Control, nice! More for the lighting and color bleed, but the game itself, IQ-wise (textures, complexity of objects, and so on), is not the best compared to other non-RT games with better overall IQ. Love the gameplay, which to me made the game; if I did not have RT hardware I think I would have enjoyed it just as much. The 6900 XT with RT sucks while the 3090 kicks ass. DLSS had too many artifacts for me to use and was basically not needed anyway.
RT in Metro Exodus looks great; while I have not completed the game, it adds to the environment and depth. It also played fine with RT on with the 6900 XT at 1440p, but better with the 3090. DLSS works great in this title as well.
RT in Doom Eternal, very nice. While it is just reflections, id did not overdo it, and it adds depth and complexity to the game with some very nice artistic background objects and structures. Plays great on either the 6900 XT or the 3090. DLSS also works well, but performance is so fast to begin with that it is not really needed. In my case I am rendering at 5K, DLSS Quality on a 1440p monitor, for a slight IQ increase. The game looks and plays amazingly.
Shadow of the Tomb Raider: while many don't notice much of a difference with RT shadows, I do. The more distant shadows are correct, not low-resolution mip-mapped shadow maps that are at times disconnected from the object and full of jaggies. If you know what you are looking at, it becomes very obvious. The 6900 XT had no issue running it either; while not as fast as the 3090, it gave gameplay where, in a blind test, I would probably not be able to tell the difference between the two. Sometimes FPS means little once you are smooth and consistent.
 
What about 6900XT vs. 3090? For 90% productivity work? DaVinci Resolve, Premiere Pro, Lightroom, various NLEs with lots of plugins, etc.
3090; I have both. I actually think the 6900 XT is better in some gaming circumstances, but that's not what you're doing.
 
What about 6900XT vs. 3090? For 90% productivity work? DaVinci Resolve, Premiere Pro, Lightroom, various NLEs with lots of plugins, etc.
Possibly better off with a 2080 Super/3060 Ti than a 6900 XT for most of those:

https://videocardz.com/newz/amd-rad...geforce-rtx-3060-ti-in-davinci-resolve-studio

[Chart: RX 6900 XT PugetBench overall score comparison]
 
I've now sold my 3080 and am dithering between a 6900 XT for ~£1200 or an RTX 3080 Ti for £1300. Decisions, decisions...
 
No-brainer.
If the 3080 Ti had been around at the launch of the 30xx series, I would have one instead of a 3090.
Don't get me wrong, I love the 3090, but I'm not normally a Titan purchaser.
I have a 3090, but I'd take the 3080 Ti over it for most things.
 
I'd have sold my 3090 off and kept the 3080 Ti FTW3 I was in the queue for if it weren't LHR. Then again, if that were the case, market values between the two would be just $200 apart instead of the current $800, since the lower VRAM amount actually runs cooler for mining; engineering samples of the Ti were easily doing 122 MH/s at 270 W.
 
Why? Is that just because of the lower cost of the 3080 Ti?
For me, it would have made sense in November when pricing was normal.
I haven't played a game that needs 12 GB yet; 24 GB may have a place, but not right now.
And the 24 GB memory temps are a bit wild.
 
I have both a 6900 XT and an Nvidia Titan RTX. Unless you're dead set on DLSS and Nvidia's version of the new Resizable BAR feature, there is really no difference for me. Granted, I have not had time for some of the newer games since the beginning of my school year and my full-time job. I can't see a difference in Battlefield V, but there's a major difference in Metro compared to an AMD card; Metro's developers just coded the game better, and RT and DLSS are flawless there. They are both great cards. For me, it just comes down to which one will give you the best long-term investment and enjoyment. I love both, but AMD won out this time. I scored my 6900 XT on eBay for $1299... a super lucky grab.
 