Looking for feedback: 6900xt vs 3080

I have a 6900 XT and was experiencing this so-called "old game" or "light load" stutter. It drove me nuts until I spent some time in the software and shut off all the power-saving/adaptive crap and just let the card crank. Voila, it was fixed.

The same problem exists if you have a 3080/3090-class card. Try playing e-sports-style games and it's a stutter fest because clock speeds keep dropping down to around 800 MHz. You have to manually set "Prefer maximum performance" for stuff with lighter loads to stop the constant clock-speed jumping.
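For the NVIDIA side, one way to script that workaround, rather than flipping "Prefer maximum performance" per game in the control panel, is to lock the core clock range with nvidia-smi. A minimal sketch, assuming a recent driver that supports clock locking and admin rights; the 1500-1900 MHz range is only an illustration, not a recommendation:

```python
import subprocess

def lock_clocks(min_mhz: int = 1500, max_mhz: int = 1900) -> None:
    """Pin the GPU core clock range so light loads can't drop to ~800 MHz."""
    subprocess.run(
        ["nvidia-smi", f"--lock-gpu-clocks={min_mhz},{max_mhz}"],
        check=True,
    )

def reset_clocks() -> None:
    """Hand clock management back to the driver (restores power saving)."""
    subprocess.run(["nvidia-smi", "--reset-gpu-clocks"], check=True)

if __name__ == "__main__":
    lock_clocks()  # before a light-load / e-sports session
    input("Clocks locked. Press Enter after your session to reset...")
    reset_clocks()  # let the card idle back down on the desktop
```

On the 6900 XT side, the equivalent lives in the Radeon Software power/tuning toggles mentioned above.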
 
The 6900 XT "struggles" at higher resolutions as much because of its narrower/cheaper memory bus as because of its memory speed.
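To put rough numbers on that: raw memory bandwidth is just bus width times the per-pin data rate, and on reference specs the 3080's wider, faster GDDR6X setup comes out well ahead. A back-of-envelope sketch (reference specs, deliberately ignoring Infinity Cache):

```python
# Back-of-envelope memory bandwidth from reference specs:
# bandwidth (GB/s) = (bus width in bits / 8) * effective data rate per pin (Gbps)
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(f"6900 XT  (256-bit, 16 Gbps GDDR6):  {bandwidth_gbs(256, 16.0):.0f} GB/s")  # ~512
print(f"RTX 3080 (320-bit, 19 Gbps GDDR6X): {bandwidth_gbs(320, 19.0):.0f} GB/s")  # ~760
```

The Infinity Cache is what closes that gap at lower resolutions, but its hit rate drops as resolution rises, which is why the shortfall mostly shows up at 4K.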

I 100% concur with your assessment, however. Cost being equal, if you don't want RT or don't play games that use it, the 6900 XT is the best overall choice (even at higher resolutions). If you want ray tracing (or DLSS, which in my view is a degradation of IQ), get the 3080. Also, that Aorus is definitely a step above the stock Zotac cooling solution and build, so that's another advantage.

In any case, I've owned both cards, and the experience is quite close. The only reason I moved to a 3080 Ti instead of sticking with a 6900 XT (or 3080) is that I do play some games where RT shows some value, albeit small, e.g. CP2077 and SotR.

Edit: I never saw any of the issues with minimums or low-power states on my Red Devil 6900 XT that some are claiming. FPS was rock solid for me at all times and maxed out (across the 9-10 games I was playing on it).

This will be a much more interesting choice next gen, since AMD is rumored to take back the performance crown: they will, at minimum, be doubling the Infinity Cache, so high resolutions won't be an issue anymore. On the flip side, Nvidia will still have the advantage of dedicated RT hardware. So if AMD is 10-15% faster in raster at 4K, would you go Nvidia for RTX or stick with AMD for more fps?
 
Interesting indeed, since we should see even more games with RT support, and better implementations. I'd predict that if 1) AMD can best nV in raw rendering AND 2) they improve RT performance, AMD may be the best pick. If they can't significantly improve their handling of RT, I think they'll be at a disadvantage even if raw performance is good. But in the end, this is conjecture, and getting a new-gen card will be like threading a needle on a jet ski, and/or you'll need to go into indentured servitude to pay for one.
 
6900XT =/= 6900XTXH

Raytracing is lackluster this gen. Maybe in a couple of years. MAYBE.
Remember Physx?
RTX is lacklustre? Ok sure.....

Needless to say, my upgrade last week from a 2070 Super to a 3080 Ti FE was a huge one: 19 fps on the 2070 Super to no less than 60 fps on the 3080 Ti. Exact same settings: 3440x1440, Ultra + all RT on (Psycho RT Lighting), DLSS on Balanced. The 12700KF seems to breathe a bit better in this game now too; CPU usage was in the 30s with the 2070 Super, but now it's around 50-60%, while in both cases the GPU was at 99%.

It's a no-brainer really. RTX every time, because you get a free framerate boost with RT enabled when it's used with DLSS.
 

Attachments

  • Cyberpunk 2077 screenshots, 2021.12.19 - 2021.12.21 (11 JPG files)

He's not wrong or right. For a few games RTX is amazing. For others, eh? It all depends. It'll matter a lot more in a year or two. Right now? Eh? I've got pretty much all the cards. I don't notice a huge difference: my 3090 is faster than the 6800 XT, but they're both absurdly fast at anything I do. I did play Control on the 3090, but it ran just fine with RT on the 6800 XT too.
 
I suppose RT in games like 2077 is noticeable to those who do look at visual things, though. There is a marked difference in RT reflections, for example, and in the subtle realism of shadow casting and light illumination on a global scale.

DF showcased it best; the differences are definitely noticeable. Plus, with DLSS enabled it's a free visual upgrade, so why would anyone not use it if available?

 
Because they have no notable interest in that particular game? It made a big difference for Control, but Control also works fine on AMD with RT on or off. Never tried CP2077, but that's a game I'll get when it goes on sale for $10 to try.

As for the rest, plenty of games work fine on AMD with RT too. You might not get 150 FPS anymore, but if it's single player, I don't always notice that much, especially with FreeSync. It all depends on the game. And if you're doing things without RT, they're about the same. Buying for a couple of games today may or may not make sense to someone. If you upgrade every generation, who cares? You'll get the next set by the time most games have it anyway. If you keep it longer, it may matter more. YMMV and all that. Both are great cards today.
 

Raytracing on and off in Far Cry 6.
The video is less than 2 minutes.
How much difference can you see?

No difference, not because the game has a lackluster ray tracing implementation, but because it already has fantastic lighting. Meaning you can get near ray-traced visuals without the massive performance impact.
 
That video refers to FC6; my post was in reference to Cyberpunk 2077, which is why I added screenshots of that game in my original comment. I have not played FC6 and do not know to what extent RT is utilised in that game, but I do know what RT technologies are used in Cyberpunk, and there is a visual difference, as the Digital Foundry video on YouTube shows in their fine-detail dive into the differences with it on and off.

Either way, I have just watched that video and I'm confused: either FC6 uses RT all wrong, or the video creator got things the wrong way round. The video's RTX On half shows FEWER reflections than the RTX Off side. That doesn't make sense...

Not enough to justify the scalped price of a 3080 Ti, and the FE's 10GB of RAM is a deal breaker even at lower scalped prices.
This is only a problem if you pay scalped prices and if you also buy a 10GB card. I did neither: I paid Nvidia MSRP and got a 3080 Ti FE, which are all 12GB.
 
My view is that RT is just on the cusp of being relevant, especially if you play games that have a solid implementation. If you plan on keeping your card for the next year (I know I am, because of prices), then you should stretch to a 3080 Ti, or a 3080 (but only if you play below 4K). If you plan on upgrading within a year, i.e. next gen, it's a toss-up: consider the games you play, cost, and availability.
 
Even 12GB is low in today's world. I mean, it's only 1GB more than the 1080 Ti from 2017… If I were in the market today, I wouldn't get anything less than 16GB, which means a 6900 XT or 3090. Anything else is mid-tier class in 2 years or less.
 
Even those GPUs will most likely be mid tier unless AMD and Nvidia bomb the next generation.
 
But older GPUs like the 1080 Ti did not make use of VRAM as efficiently as modern cards like these do. There's SAM/Resizable BAR for starters, and the 30 series uses GDDR6X. More is always better, but it's not the be-all and end-all, depending on the card class itself.
 
SAM/Resizable BAR and GDDR6X have nothing to do with having enough VRAM to store all the data needed to render frames without having to wait on main memory or the page file. If you had said today's cards have better delta compression, then yes, that helps, but no, what you said has nothing to do with VRAM capacity.
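To make the capacity point concrete, here is a rough, illustrative budget for a single 4K frame. The buffer list, formats and the 6 GiB asset pool are assumptions for the example (real engines differ widely); the point is only that once the total exceeds the card's VRAM, data spills over PCIe to system RAM, and that spill is what stutters:

```python
# Rough, illustrative VRAM budget for one 4K frame: a few render targets
# plus a streaming pool for textures/meshes. Formats and sizes are
# assumptions for the example, not any real engine's layout.
WIDTH, HEIGHT = 3840, 2160

bytes_per_pixel = {
    "back buffer (RGBA8)":    4,
    "HDR colour (RGBA16F)":   8,
    "depth/stencil (D32S8)":  5,
    "G-buffer (4x RGBA8)":   16,
    "motion vectors (RG16F)": 4,
}

render_targets_gib = sum(bytes_per_pixel.values()) * WIDTH * HEIGHT / 2**30
asset_pool_gib = 6.0  # assumed texture/geometry streaming budget

total_gib = render_targets_gib + asset_pool_gib
print(f"Render targets: {render_targets_gib:.2f} GiB")
print(f"Total frame footprint: {total_gib:.2f} GiB")
# If this total exceeds the card's VRAM, the driver starts paging over PCIe
# to system RAM, and that is what stutters - regardless of GDDR6X, SAM or
# Resizable BAR.
```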
 
My point was about efficiency, not capacity. Yes, more is always better, but a 1080 with 24GB of VRAM will not be running games in any way, shape or form better than an 8-12GB 30-series card. It could be argued: what if you game at 4K? Well, a 24GB 1080 won't be running any games at 4K because it's just not efficient or powerful enough to begin with.
 
Okay, I understand your argument now; apologies for the misunderstanding.
 