The Current Dilemma: RTX 3080 vs 6900 XT

harmattan

Supreme [H]ardness
Joined
Feb 11, 2008
Messages
5,129
I know this question has been out there in the broad domain, but I'm trying to figure out the consensus of the fine SMEs at [H]... I have an opportunity to swap out my RTX 3080 for a 6900 XT, likely recouping $50-100 in the process. Worth the trade? A few points that may influence the decision:
  • I'm running a 4K 60 Hz screen
  • I have a passing interest in ray tracing, but only 2-3 games I've played/am playing leverage RT
  • I don't really care for DLSS -- it looks like an astigmatism filter to me, and I'd rather scale back settings to run "clean"
  • I stopped "manufacturing value" by wasting energy/resources back in 2014, and the mining ROI is no longer splendid anyway -- sitting on my wallets until proof of stake at this point
  • I have a good-quality 850W PSU
  • I'm running an R7 5800 @ 4.5 GHz
 
I would definitely make the switch if you don't care about DLSS and are getting some money out of it. The only reason I can think of to take a 3080 over a 6900xt is if you love DLSS or if you plan to mine crypto.
 
I would tend to agree with Andrew. It's a faster card and has more memory, which in the long run might be more important.
 
DLSS and RT performance will be crucial moving forward... depends on the games you play, I guess. You still can't enjoy CP2077 fully without a 3080/3090. Not sure about the market atm, but I managed to move up from a 3080 to a 3090 for about $200 out of pocket; that's the trade I'd advise. 3090s will outlive the 6900 XT: more VRAM and much, much better RT perf.
 
If you don't want to use any of the features of the 3080, and the games you're playing would get some appreciable fps gain while you make a couple hundred bucks, go for it. I would try to get more than 50 to 100 bucks, assuming it's non-LHR (and even LHR cards can be used for altcoins), because I assume mining is why the other party is looking to trade for one, and they'll be making more than that in less than a month at current rates.
 
If you don't care about RT or DLSS, the RX 6900 XT will be the better card.
 
No brainer. Get the 6900XT.

Ray tracing isn't all it's cracked up to be, and AMD has their version of DLSS now.
 
In the same boat - only 3080 or 6800XT. I have a spare card, and a friend that wants to buy one from me. Both top-end versions of their type - it's for a secondary system for me (5120x1440), and he's at normal 1440P. Leaning just slightly towards the 3080 because it's ~in~ the system already, while the 6800XT is in a box fresh back from RMA.
 
I'd actually touch on a different note and try to replace that 60 Hz monitor if you're gaming on any intense scale. That UHD 60 Hz panel likely has no sync, or a garbage sync range from 45 to 60 Hz, so it'll be all over the place if you ever dip frames. Even at good FPS, it's probably still sluggish compared to a 100 Hz+ monitor.
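To put some numbers behind that sync-range complaint, here's a rough sketch (my own illustration, not vendor code) of why a narrow 45-60 Hz VRR window behaves so badly on frame dips: Low Framerate Compensation (LFC) can only repeat frames to stay in range if the window's max is at least about double its min, which 45-60 fails.

```python
# Sketch: does a VRR (FreeSync/G-Sync compatible) range support LFC, and what
# refresh rate would the display actually run at for a given fps?

def lfc_possible(min_hz: float, max_hz: float) -> bool:
    # LFC shows each frame multiple times, so the doubled rate must fit
    # inside the window: max must be at least ~2x min.
    return max_hz >= 2 * min_hz

def effective_refresh(fps: float, min_hz: float, max_hz: float):
    """Refresh the panel would run at for this fps, or None if the frame
    rate falls outside what the range can handle (stutter/tear territory)."""
    if min_hz <= fps <= max_hz:
        return fps                       # native VRR
    if fps < min_hz and lfc_possible(min_hz, max_hz):
        mult = 2
        while fps * mult < min_hz:       # repeat each frame enough times
            mult += 1
        if fps * mult <= max_hz:
            return fps * mult
    return None

# A 45-60 Hz 4K panel: 60 < 2*45, so no LFC -- a dip to 40 fps falls
# straight out of the sync window.
print(lfc_possible(45, 60))            # False
print(effective_refresh(40, 45, 60))   # None
# A 48-144 Hz panel handles the same 40 fps by doubling to 80 Hz.
print(effective_refresh(40, 48, 144))  # 80
```

This is why a wide-range high-refresh monitor degrades gracefully on frame drops while a 45-60 window does not.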

A 6900 XT trade would be cool though. Also, as you said, DLSS is useless with those cards; all it does in Warzone is make things blurrier for me. These cards have plenty of power, so you can use proper forms of AA instead.
 
Guessing no one here has played Control with DLSS + RT? :)
Yes - it's amazing - but it's also only one game; most things still don't have RT yet. CP2077 and Control are the big hitters - I didn't notice much on Doom Eternal worth turning it on. And we don't know how well FSR will spread, but RT works pretty well on Control on a 6800XT - if they get FSR working on it, performance will be great too.
 
I went from a 2080 Ti to a 6800 XT and overall I'm happy with the switch. In traditional games there is definitely an improvement, or at the very least parity.

Sadly ray tracing on AMD isn't all there. It does work in some games (mostly AMD sponsored games like Dirt 5) but in Cyberpunk, for example, it is basically unplayable. Like 15 fps. At least on Nvidia there is DLSS to make RT viable.

But if you don't care about RT/DLSS, then the AMD will likely be fine for you.
 
I have an opportunity to swap out my RTX 3080 for a 6900 XT, likely recouping $50-100 in the process
Getting a "$300 more expensive card" while making money instead of having to pay in the process, and having little interest in RTX (and, I imagine, the other AI stuff that comes with it, like Nvidia Broadcast, Blender's OptiX denoiser, and so on)?

Sounds like a no-brainer, if it doesn't cause any warranty issues.

That said, if you stay with your 60 Hz monitor it could be a lateral move: does the 3080 often drop under 60 fps without RTX? If not, and if the trade makes the warranty harder to use in any way, it could be something to think about.
 
Depends on the game (this is highly game-dependent). It also depends on how you set up games.

For example: does your game work well in windowed mode on AMD vs Nvidia? (More of an issue if you're frame-limiting the game below vsync; some games I play won't frame-limit in windowed borderless on my RX 6800, no matter what I do.) It's less of an issue at 4K 60 Hz.

Overall though, I'd probably go with the 6900 XT in your setup. As someone said, warranty is a big thing, so make sure to factor that into your decision.
 
6900XT has treated me well. To note, this is on 1440p ultra wide tho. That said, admittedly, I would have snagged a 3080 had one been available back in Jan.

Time will tell regarding DLSS & RT vs FSR.
 
Yes - it's amazing - but it's also only one game; most things still don't have RT yet. CP2077 and Control are the big hitters - I didn't notice much on Doom Eternal worth turning it on. And we don't know how well FSR will spread, but RT works pretty well on Control on a 6800XT - if they get FSR working on it, performance will be great too.

Other great RT games I've played are Metro Exodus Enhanced Edition, Shadow of the Tomb Raider, and Call of Duty: Black Ops Cold War.
It's a wonderful addition; I love it.
Cyberpunk is just stunning!

Ray tracing and a decent HDR display are among the best things to happen to gaming.
 
Outside of the LG CX line I haven’t found a good HDR display yet; figure that will be reality late next year. The new G9 aside; that sucker is massive (I have the previous version).
 
Outside of the LG CX line I haven’t found a good HDR display yet; figure that will be reality late next year. The new G9 aside; that sucker is massive (I have the previous version).
My Q9FN TV and CRG9 ultrawide are great.
The CRG9 needs careful setup because of its lack of dimming zones, but it's very impressive.
I'm spoiled by my Q9FN TV with HDR2000 and much better dimming, yet I much prefer playing games on the CRG9 with HDR on.

For quite some time HDR was broken by Nvidia (the black level was grey); many HDR games and displays got a bad rap because of that, but it's fixed now.
Before it was fixed I couldn't properly enjoy games in HDR, so I turned it off.
Now it's fab.
 
I have a 6900 XT on an LG 4K FreeSync 60 Hz monitor -> it has no problem pushing past 60 fps in a number of titles, so I use Chill to keep it in the FreeSync range. Currently playing Rise of the Tomb Raider all maxed out and it is beautiful; the GPU runs around 85% loaded or less. I briefly played Metro Exodus on a 1440p 144 Hz HDR monitor and it played well with RT from what I remember (that system now has a 3090 in it). With a 6900 XT you will never run out of memory at 4K; the 3080, I would say, you can push hard enough that you do. As for DLSS, which has been advancing: I would say FSR will advance as well since it is an open standard; developers can push it to the next level, and it is not solely dependent on one company's graces. Meaning it may advance at a quicker rate and get adopted at a much quicker rate than DLSS. Long term -> 6900 XT; shorter term and mining -> 3080.
 
Something to bear in mind.
If you use DLSS or FSR, games render at lower res so use much less memory.
 
Hey Jensen, can I borrow your leather jacket?
I'm sorry you have a bias, but it highlights why I made my decision.
The card I bought does what I want, and does it very well. No bias involved, other than it doing what I need it to and actually being available in a shop.
 
If you are going to keep your GPU for longer than 3 years, the 6900 XT may be the better GPU with its 16GB of VRAM; if not, don't bother.
 
I have a 6900 XT on an LG 4K FreeSync 60 Hz monitor -> it has no problem pushing past 60 fps in a number of titles, so I use Chill to keep it in the FreeSync range. Currently playing Rise of the Tomb Raider all maxed out and it is beautiful; the GPU runs around 85% loaded or less. I briefly played Metro Exodus on a 1440p 144 Hz HDR monitor and it played well with RT from what I remember (that system now has a 3090 in it). With a 6900 XT you will never run out of memory at 4K; the 3080, I would say, you can push hard enough that you do. As for DLSS, which has been advancing: I would say FSR will advance as well since it is an open standard; developers can push it to the next level, and it is not solely dependent on one company's graces. Meaning it may advance at a quicker rate and get adopted at a much quicker rate than DLSS. Long term -> 6900 XT; shorter term and mining -> 3080.
I think you've tipped me towards the 6900xt. I'm sure RT will show material benefits at some point, but I've really not been that impressed (including those games you list). Considering it's implemented in literally 3-4 games I play (2.5+ years after nV's promise of an RT revolution in gaming), and I don't even find it that fantastic, it's hardly a driver for my GPU choice.

Again, DLSS is a detriment IMO: I'd rather lower resolution and have the game run clean. And I don't mine, never will again.
 
Today the driver for GPU choice is poor availability and how inflated the prices are.
If something is inside the PC and works, and there is nothing to gain* from such an exchange, then I do not see the point, other than continuing the RTX rant from 2018.
Pulling and moving cards is a potential point of failure. Do not recommend.

*) Prices of GPUs are so ridiculous that 100 bucks is nothing.
And it is not even like all games run better at 4K without RT on the 6900 XT. It is hit or miss, and it averages out.
 
*) Prices of GPUs are so ridiculous that 100 bucks is nothing.
And it is not even like all games run better at 4K without RT on the 6900 XT. It is hit or miss, and it averages out.

If I could run the old, non-updated engine version of MSFS (the most intensive thing out there) playably with my 2080 at 3440x1440, I can assure you there is not a single game (outside of modded stuff) a 6900 XT won't be able to run at 60 Hz UHD. I also aim for 100 Hz / higher than 60 fps in my other titles.
 
I think you've tipped me towards the 6900xt. I'm sure RT will show material benefits at some point, but I've really not been that impressed (including those games you list). Considering it's implemented in literally 3-4 games I play (2.5+ years after nV's promise of an RT revolution in gaming), and I don't even find it that fantastic, it's hardly a driver for my GPU choice.

Again, DLSS is a detriment IMO: I'd rather lower resolution and have the game run clean. And I don't mine, never will again.
My experience with DLSS seems to reflect yours; the only game so far where I am using DLSS is Doom Eternal, rendering DSR @ 5K with Quality DLSS on that 1440p 144 Hz HDR monitor. It gives slightly faster fps and about the same quality as DSR 4K at 1440p. Every other game, except maybe Metro, was not a good experience. Once a good 120 Hz+ 4K monitor comes about at a reasonable price, I will upgrade the monitor for the 3090.
 
WD Legion is another game where RT+DLSS makes a big difference. Assuming MSRPs were applicable, the 6800 is the only card this gen that is a clear winner over the 3070/Ti :) .
 
My Q9FN TV and CRG9 ultrawide are great.
The CRG9 needs careful setup because of its lack of dimming zones, but it's very impressive.
I'm spoiled by my Q9FN TV with HDR2000 and much better dimming, yet I much prefer playing games on the CRG9 with HDR on.

For quite some time HDR was broken by Nvidia (the black level was grey); many HDR games and displays got a bad rap because of that, but it's fixed now.
Before it was fixed I couldn't properly enjoy games in HDR, so I turned it off.
Now it's fab.
Allow me to rephrase: outside of the CX (or other massive TVs, which I don't particularly find useful for anything but gaming or watching TV on), I've yet to find a good high-resolution 4K HDR screen that ~I~ like, for a reasonable price (the new god monitor from ASUS aside), that is good for everything (which is what I buy).

The G9 I have (same size as your CRG9, just the less-gaming, more-work-focused version I think? Or maybe it's just the G-Sync compatible version ~shrug~) is about as big as I could stand, since you can still use 100% scaling on it. With a big 4K panel I tend to need 200% scaling, so you lose effective screen real estate, or you're sitting too close to make it really practical, imho.

The PG line from ASUS is good and close, but has some blurring with HDR and high refresh that I've seen (I'm using their 1440p screens at 120 Hz, just no HDR, since it, well, sucks on the lower-end ones and adds blurring on the higher end).

I'm waiting for a 32-34" 16:9 HDR1000+ display with 120-144 Hz and no blurring, for $1,200-1,500. We're really damned close (and the new Neo G9 shows it's right around the corner!), so I figure next year I'll upgrade that system to whatever comes out. I'm using 27" 1440p screens now for everything but my HTPC (4K 60 Hz) and my work workstation (the G9).

edit:
Basically, I want this (https://www.asus.com/Displays-Desktops/Monitors/ProArt/ProArt-Display-PA32UCG/) - just a bit lower in price :p 5k is a lot of money for a screen.
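As a quick illustration of the scaling/real-estate trade-off mentioned above (simple arithmetic, nothing vendor-specific):

```python
# "Effective" desktop real estate after OS display scaling: a big 4K panel
# at 200% scaling gives you the workspace of a 1080p screen, while a
# 5120x1440 super-ultrawide at 100% keeps every pixel as usable space.
def effective_resolution(width: int, height: int, scale_pct: int):
    return (width * 100 // scale_pct, height * 100 // scale_pct)

print(effective_resolution(3840, 2160, 200))  # (1920, 1080)
print(effective_resolution(5120, 1440, 100))  # (5120, 1440)
```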
 
Something to bear in mind.
If you use DLSS or FSR, games render at lower res so use much less memory.
I don't recall DLSS making any meaningful difference at all to VRAM usage.
 
Well the intermediate buffers are lower res, but the final back buffer is still at native res. So I'm not sure there would be a huge VRAM savings.
 
Well the intermediate buffers are lower res, but the final back buffer is still at native res. So I'm not sure there would be a huge VRAM savings.
Rendering at higher res uses more memory.
Works well in reverse.
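A back-of-the-envelope sketch of that point. The buffer count and bytes-per-pixel below are made-up illustrative values, not measured from any engine; the only claim is structural: intermediate render targets scale with the internal render resolution, while the final output buffer (and the textures) stay at native size, so total VRAM shrinks by less than the raw scale factor would suggest.

```python
# Why upscaling saves less VRAM than the render-resolution cut implies:
# only part of the frame's memory shrinks with render resolution.

def buffer_mib(width: int, height: int, bytes_per_pixel: int) -> float:
    return width * height * bytes_per_pixel / 2**20

def frame_buffers_mib(native=(3840, 2160), render_scale=1.0,
                      intermediate_targets=8, bpp=8) -> float:
    # Intermediate G-buffer/lighting targets follow the render resolution...
    rw, rh = int(native[0] * render_scale), int(native[1] * render_scale)
    intermediates = intermediate_targets * buffer_mib(rw, rh, bpp)
    # ...but the final back buffer stays at native output resolution.
    back_buffer = buffer_mib(native[0], native[1], 4)
    return intermediates + back_buffer

native = frame_buffers_mib(render_scale=1.0)
quality = frame_buffers_mib(render_scale=0.667)  # ~67% per axis, like DLSS Quality
print(f"native: {native:.0f} MiB, quality: {quality:.0f} MiB")
```

Real games add textures, geometry, and upscaler overhead on top, which is why the measured savings bounce around so much from title to title.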
 
In the same boat - only 3080 or 6800XT. I have a spare card, and a friend that wants to buy one from me. Both top-end versions of their type - it's for a secondary system for me (5120x1440), and he's at normal 1440P. Leaning just slightly towards the 3080 because it's ~in~ the system already, while the 6800XT is in a box fresh back from RMA.
The only reason here to go with the 6800 XT is simply to have something different. Performance is more or less a wash, and Nvidia wins on extra features such as better RT performance, DLSS support, Nvidia Broadcast, and industry-leading video encoder quality.
 
Well the intermediate buffers are lower res, but the final back buffer is still at native res. So I'm not sure there would be a huge VRAM savings.
And because the textures need to stay high res, some games will even see a very small increase, I think (DLSS overhead), but some seem to go down.

Maybe it's the earlier pipeline stages (the geometry and shading that occur before that point in the flow) where the decline can occur in some cases?

https://static.tweaktown.com/conten...ow-does-hideo-kojimas-game-run-at-8k_full.png

Cyberpunk at 4K with RTX on seems to be around 9.6GB without DLSS (at 2:25), down to 8.1GB at DLSS Quality and 7.3GB at DLSS Performance, while Metro Exodus goes up by 300MB with DLSS enabled.
Fortnite, especially with DLSS Performance, goes from 8.2GB to 6.3GB with RT maxed, while Call of Duty uses 20GB no matter what.
 