AMD Radeon 6900 XT Review Round Up

Hardware Unboxed review

4K, high quality, cost per fps (well, as long as MSRP means anything...)

3070..: $6.84
6800..: $6.90
6800xt: $6.98
3080..: $7.14
6900xt: $9.80
3090..: $13.88


at 1440p:
3070..: $4.00
6800..: $4.05
6800xt: $4.14
3080..: $4.57
6900xt: $5.91
3090..: $9.09
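
To make the arithmetic explicit: these figures are just MSRP divided by average fps. A minimal sketch in Python, using launch MSRPs and average-fps numbers back-derived from the table itself (approximations, not the reviewer's exact data):

# Cost per frame = MSRP / average fps.
# MSRPs are launch prices; the fps figures are rough values
# back-derived from the table above, not Hardware Unboxed's exact data.
msrp       = {"3070": 499, "6800": 579, "6800xt": 649,
              "3080": 699, "6900xt": 999, "3090": 1499}
avg_fps_4k = {"3070":  73, "6800":  84, "6800xt":  93,
              "3080":  98, "6900xt": 102, "3090": 108}

for card, price in msrp.items():
    print(f"{card}: ${price / avg_fps_4k[card]:.2f} per fps at 4K")

E.g. the 3090 at 4K: 1499 / 108 ≈ $13.88, matching the table.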


I feel like it opens the door wide for a 20 GB 3080 Ti at $800 to be the best buy among the biggest cards (though I could see AMD, over time, pricing the 6900 XT to beat it). At that price point it is hard to pinpoint what this card is aiming at: the crowd that buys expensive motherboards just to game and wants the most expensive everything will probably go for the 3090, and if value is any consideration, the 6800 XT looks like such a no-brainer.


There is one specific scenario, for the very wealthy, where the small difference keeps the frame rate above the 44 fps (or whatever other limit) floor of the monitor's VRR range. Staying inside the FreeSync window versus dropping out of it could make a small performance gap worth a very large amount, I imagine.
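
To make that concrete, a toy sketch (the 44 Hz floor is just an assumed example, as are the fps numbers):

# Toy illustration: a small average-fps gap matters disproportionately
# if it decides whether you stay inside the monitor's VRR window.
# The 44 Hz floor is an assumed example, not any specific monitor.
VRR_FLOOR = 44

for card, avg_fps in [("slightly faster card", 47), ("slightly slower card", 42)]:
    if avg_fps >= VRR_FLOOR:
        print(f"{card}: {avg_fps} fps -> stays inside the VRR window")
    else:
        print(f"{card}: {avg_fps} fps -> drops below the floor, VRR stops working")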
 
It is a bit like feeling sorry for someone who just bought fancy new tires for their Porsche when a new, better model just came out.


If some young person is putting that kind of money on a credit card and struggling to pay it off, I think we can feel sorry for them even if it was a nice +30% performance over the $700 card, but I imagine they are all very rich people.

People buying a 3090 know that they’re not getting a good deal, they know that it’s not worth spending more than double the alternative for a marginal gain. That is the “I want the best no matter what” demographic. If they don’t have enough disposable income and have to throw it on a credit card that they’re struggling to pay, then they need to get their priorities straight, but I doubt this represents any significantly measurable number of customers.
 
Good chuckle from that post. Last vacation taken was over $10,000. Personal attendant for the week, bla bla bla.
More enjoyment from fiddling around with PC parts. Cheaper, also.

That doesn’t change the fact that the 3090 is a pointless card for gaming by any realistic measure. If you want to throw $1500 down on it because you take $10,000 vacations with your personal attendant, that’s totally up to you. This still does not make it a rational purchase given the alternatives. You can make an argument that it’s useful for non-gaming tasks at that price point in a similar way that the Titan was, but Nvidia is marketing this as a gaming card, so we have to measure it through that lens, and through that lens, there is no justification for anyone to throw down $1500 when they can throw down $700 at a cost of 10% of the performance. If AMD comes out with a $1000 alternative, I’m not going to be sympathetic because it was never a rational financial decision to begin with.
 
Was hoping AMD might have killed it with this; while it's an amazing piece of hardware for the price, they didn't take the performance crown for once like I'd hoped. I'm still happy with my RTX 3090 purchase. Yeah, it's more expensive, but this is [H] and it's still the fastest.

I can admit I didn't want the 6900 XT wrecking my 3090. I'm happy it came close. :)
 
I'll keep an eye on the 6900 XTs; the new Threadrippers aren't out yet, so I have time to wait. Not that I could buy a 6900 XT even if I wanted to atm, they are gone everywhere. But as it stands I will have a tough choice between them and the 3090; it's going to come down to a few options, and I'll have to see what scales better with what overall.
 
I was going to give getting one the old college try (fully expecting not to get one) but I got busy with work and didn't realize today was the day, oops, so I guess I didn't even get to try.

I guess the old Pascal Titan X will have to continue to soldier on for a while longer. :/

It's kind of frustrating how badly everyone has dropped the ball this gen. A situation like this would tempt a third party to enter the market, since the current players have obviously become too complacent, if not for the fact that the barriers to entry are too damn high.
 
lol, I bought mine for gaming and damn it's good!
I was after a 3080 but they cannot be found; 3090s have been in stock to actually purchase, gasp!
The 3090 costs a lot more but has enough benefits to tide me over for 4 years instead of 2.
The cost per year isn't much different.
The first 2 years give a definite performance advantage; the last 2 years will probably show a slight performance deficit compared to newer cards coming out.
But it will still hold up very well compared to xx80 cards, which are my target spend.
And I don't have to sweat it out trying to buy a card in 2 years, which is worth a lot to me.

That's pretty rational imo ;)
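
The rough per-year math, assuming launch MSRPs and ignoring resale value:

# Rough amortization: keep a 3090 for 4 years vs. buying an
# xx80-class card now and again in 2 years (launch MSRPs assumed).
cost_3090 = 1499 / 4          # ~ $375 per year
cost_xx80 = (699 * 2) / 4     # ~ $350 per year over the same 4 years
print(f"3090: ${cost_3090:.0f}/yr   xx80 route: ${cost_xx80:.0f}/yr")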
 
I need the memory: 10 GB was too small, 16 is about perfect, 24 is just fun. But a few things I do just work SO much better with CUDA, so come June/July, unless there is some sort of shakeup, I think the 3090 is where I am going. There are a few of the old blower models kicking around that would be delightful.
 
Not when you get RT involved.

Within a year, more-demanding RT games are all going to perform worse on it than on the (cheaper) 3080.

The target market of this card is really hard to define (same VRAM as the 6800, so no real $300 value there).

I don't see RT mattering much within a year, whether from a game-quantity, PQ, or acceptable-performance perspective.
 
Looks like my local MicroCenter never got any :p


[attached photo: PXL_20201208_205604863.jpg]


Edit:
Whoops, that wound up being a little larger than expected.

It's very deceptive when you upload from your phone.
 
This is a nice breakdown. I wonder how the 3060 Ti stacks up against the 3070 in cost per fps.
 
I don't see RT mattering much within a year, whether from a game-quantity, PQ, or acceptable-performance perspective.


So you don't discount the fact that the $300-more-expensive card is copying the same pointless RTX 3090 trick (even worse, because they add no more VRAM)?

It's 5% higher performance for $300.

Also, be sure to wait to call RT pointless until you see big titles like Cyberpunk launch. Within 12 months, the 6900 XT will become the world's greatest regret purchase.
 
A great advantage the 3090 has atm is that you can actually buy it, lol.
 
So you don't discount the fact that the $300-more-expensive card is copying the same pointless RTX 3090 trick (even worse, because they add no more VRAM)?

It's 5% higher performance for $300.
I made no comment on the 6900 XT's value proposition. It's a pointless card to purchase, and it's a card AMD needed to make.
Also, be sure to wait to call RT pointless until you see big titles like Cyberpunk launch. Within 12 months, the 6900 XT will become the world's greatest regret purchase.
I don't have to wait to know it will have too big of a performance hit for too little PQ improvement.
 
I feel like it opens the door wide for a 20 GB 3080 Ti at $800 to be the best buy among the biggest cards

Nah, given the 6900 XT performance, if a 20 GB 3080 Ti comes to pass, it'll be at LEAST $999.

EDIT: Looks like Krenum beat me to it.
 
Average of 22 games @ 4K from the same source. 6900 XT is only 2% faster than the 3080. And only within 1% at 1080p and 1440p.

You can hope for performance to improve at a greater rate on AMD than on Nvidia (historically it typically does), but there is no guarantee of that.
Its RT performance is around RTX 3070 level. If anything, it's only going to get worse over time.
 
Its RT performance is around RTX 3070 level. If anything, it's only going to get worse over time.
This right here is why you don't buy these AMD cards at those prices. If the $500 RTX 3070 performs faster in Minecraft ray tracing than a $1k 6900 XT, then AMD fucked up. Ray tracing is going to be the future of games, and ignoring ray-tracing performance on $500+ graphics cards is just stupid. What's worse is that this $1k graphics card still comes with GDDR6, not GDDR6X like the RTX 3080 and 3090, so higher resolutions tend to fare worse. If AMD wanted a big win then they should have used GDDR6X.
 
Tell me that in ~five years and I will give a damn. We are two generations of hardware away from getting really usable and meaningful ray tracing in more than a small handful of games.
 
This right here is why you don't buy these AMD cards at those prices. If the $500 RTX 3070 performs faster in Minecraft ray tracing than a $1k 6900 XT, then AMD fucked up.
I imagine that "you" is quite specific. It is a bit like saying Blender renders are often way faster on a 3060 Ti than on a $1K 6900 XT, therefore AMD fucked up. I would not fall off my chair if I learned that there are more Blender users in the world than people actually playing Minecraft with ray tracing on (playing, not just testing it). There is a market out there of people who do not care about Blender performance when they buy a video card; the same goes for ray tracing.

You do not buy the $1K 6900 XT if ray tracing matters to you (say, if Cyberpunk with RT on looks better to you, or some other example) and the sacrifice to get it seems a good tradeoff. For many, that will not be the case, and games avoiding it entirely, a la Dark Souls (much like the games that concentrated on being playable at 60 fps without minding 4K), could end up looking the greatest of this generation.
 
A good showing from AMD; they are definitely trading blows with the RTX 3090 for much less.

However, the overall uplift from the 6800 XT isn't amazing and the ray tracing perf seems lackluster.

Honestly, at this point, I'm just gonna sit tight with my 2080 Ti and see what the situation looks like next year.
 
Wise decision. Everything about this year has been a clusterfuck. Hopefully Q2 of 2021 will be fruitful. I really hope so but I'm not 100% sure it will be.
 
You should seriously consider selling the 2080 Ti right now if you can land one of these new cards. People are way overpaying for them on eBay. I sold the two FEs I owned for $950 apiece.
 
Good advice, but I want to wait until I can get a whole new build together. Right now it is impossible to get either a GPU or a CPU, and I would need both, and I'm not feeding the scalpers.
 
Yeah, originally I was going to just upgrade the two 2080 Ti's to a single 3090, but then I got upgrade fever and upgraded the CPU/MB as well. LOL That damn upgrade bug!!!!
 
Tell me that in ~five years and I will give a damn. We are two generations of hardware away from getting really usable and meaningful ray tracing in more than a small handful of games.
We're probably a good two years away from ray tracing becoming more of a standard. That's how games have historically gone after consoles are released. The Xbox 360 was released in 2005, and in 2007 alone we got BioShock, Portal, Halo 3, Mass Effect, Crysis, etc. From that point forward we got games that made good use of the hardware. For the Xbox One and PS4 we didn't get games that made good use of the hardware until 2015, two years later. The Witcher 3, MGSV, Batman: Arkham Knight, Dying Light, Project CARS, etc. all came in 2015 alone. We still benchmark with The Witcher 3 to this day.

Assuming COVID doesn't screw up the gaming schedule, 2022 will be a hell of a year for gaming. There's a good chance a lot of those games will be using ray tracing. If the AMD RDNA2 cards are already struggling with ray tracing in today's games, then there's a good chance they won't be playable at 60 fps two years from now. Nvidia, on the other hand, has a good handle on ray tracing, and if your RTX 3070 can't handle it then just turn on DLSS. You are paying $1k for a graphics card for what? To play today's games at or below 200 fps? Watch Doom Eternal reach 300 fps? Doesn't make sense to me.

I imagine that "you" is quite specific. It is a bit like saying Blender renders are often way faster on a 3060 Ti than on a $1K 6900 XT, therefore AMD fucked up. I would not fall off my chair if I learned that there are more Blender users in the world than people actually playing Minecraft with ray tracing on (playing, not just testing it). There is a market out there of people who do not care about Blender performance when they buy a video card; the same goes for ray tracing.
The point of Minecraft is that its implementation of path tracing may be how games ideally handle ray tracing in the future. How future-proof are AMD's RDNA2 cards if path tracing is difficult for them?
 
AMD's ML upscaling will come in 2021, and that might help with the ray tracing performance, but it's not ready now and I wouldn't make a purchase banking on it.
 
The point of Minecraft is that its implementation of path tracing may be how games ideally handle ray tracing in the future. How future-proof are AMD's RDNA2 cards if path tracing is difficult for them?
If games use path tracing in the near future, they are zero future-proof. But like you said, games will make good use of the new consoles (the RDNA2 consoles), so that generation of games will run without any problem on these cards because of that fact alone. There is nothing to worry about if you do not care for ray tracing: it will either be optional or console-compatible for every big title that needs the console market to justify its budget. There will be nicher exceptions (that cannot run with ray tracing off), but they will almost certainly stay niche.

That's how games have historically gone after consoles are released. The Xbox 360 was released in 2005, and in 2007 alone we got BioShock, Portal, Halo 3, Mass Effect, Crysis, etc. From that point forward we got games that made good use of the hardware.
Take BioShock, for example: it used Unreal Engine, which was in development long before the Xbox 360. How much relevance did the Xbox 360 have on that final product? How different would it have been if the Xbox 360 had not been released? I am missing the point you are trying to make. We know this generation of consoles handles Nvidia-style ray tracing at only about a 2060/2060 Super level; I do not see a reason to predict that much more than that level of ray tracing capability will become mandatory. For major titles, developers will have to work with a 2060S type of power in mind.
 
You are paying $1k for a graphics card for what? To play today's games at or below 200 fps? Watch Doom Eternal reach 300 fps? Doesn't make sense to me.

My now-4.5-year-old Pascal Titan X struggles to reach 60 fps in many titles at 4K with high settings.

And if you ever want to run MS Flight Simulator, well...


[attached chart: MS Flight Simulator benchmark results]


Not everyone runs games at low resolutions and poor quality. Based on the chart above, I think I might ideally need something 35% faster than a 3090...

I switched to 40+ inch screens at 4K in 2015, and there is no going back, it is so amazing. I want to render at that resolution without needing to sacrifice quality settings to get there.

I'll take any performance increase I can get, and almost any price premium (within reason, I'm not buying a Titan RTX price-point card), as long as scalpers don't get one red cent of it.
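
For what it's worth, the back-of-envelope behind that ~35% figure, assuming the chart puts a 3090 somewhere in the mid-40s fps average (my reading of it, not an exact number):

# If a 3090 averages ~44 fps in MS Flight Simulator at 4K
# (an assumption read off the chart), a locked 60 fps needs roughly:
target_fps = 60
assumed_3090_fps = 44
print(f"{target_fps / assumed_3090_fps - 1:.0%} more performance needed")  # ~36%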
 
I switched to 40+ inch screens at 4K in 2005, and there is no going back, it is so amazing.
Eh? There were 4K screens in 2005? Must've been stupid expensive.
 
I run 3440x1440, which is fewer pixels than 4K, but at 160 Hz I need all the performance I can get.

Even with a 2080 Ti, I'm usually around 90 - 120 fps, aside from games like Doom Eternal, which are super optimized.

Especially once you bring ray tracing into the picture: none of today's cards can handle that, even if you have the money.
 