AMD Radeon 6900 XT Review Round Up

We're probably a good two years away from ray tracing becoming more of a standard. That's how it has historically gone after new consoles are released. The Xbox 360 came out in 2005, and in 2007 alone we got BioShock, Portal, Halo 3, Mass Effect, Crysis, and more. From that point forward we got games that made good use of the hardware. For the Xbox One and PS4 we didn't get games that made good use of the hardware until 2015, two years after launch: The Witcher 3, MGSV, Batman: Arkham Knight, Dying Light, Project CARS, and others, all in 2015 alone. We still benchmark with The Witcher 3 to this day.

Assuming COVID doesn't screw up the gaming schedule, 2022 will be a hell of a year for gaming, and there's a good chance a lot of those games will be using ray tracing. If the AMD RDNA2 cards are already struggling with ray tracing in today's games, then there's a good chance they won't be playable at 60fps two years from now. Nvidia, on the other hand, has a good handle on ray tracing, and if your RTX 3070 can't handle it, just turn on DLSS. You're paying $1k for a graphics card for what? To play today's games at or below 200fps? To watch Doom Eternal reach 300fps? Doesn't make sense to me.


The point of Minecraft is that its implementation of path tracing may be how games ideally handle ray tracing in the future. How future-proof are AMD's RDNA2 cards if path tracing is difficult for them?

Probably more than two years. The last-gen consoles are likely to stick around longer this time than in previous generations. Even Horizon 2 will be on both the PS4 and PS5, and MS has committed to releasing first-party games on the Xbox One and One X for the next couple of years. It will be a while before games are designed solely around the PS5 and Series systems, which means things like RT are still going to get relatively minimal usage and support.

Even if you're right and RT starts getting more notable mid to late 2022, that will still be more than enough time for new cards to come out that will be better at ray-tracing, making purchasing anything right now solely for RT rather pointless. The whole concept of "future proofing" when it comes to PC hardware is ludicrous. You can sometimes do it with raw performance, but not when it comes to advanced features.

Edit: People should buy hardware for what they are doing now, not what may or may not happen two+ years down the road.
 
When it's a new feature like RT, I think there's some understanding that, especially with the exorbitant prices for mid-to-high-end GPUs these days, people want their cards to be somewhat "future proof" on major features. It's one thing for next year's cards to be more performant, but for more than a decade you could generally buy a card close to its launch and it would be able to play the latest games for quite a while, with all features involved. Not since circa 2000 could you buy some expensive GPU and literally either A) find it barely able to play next year's games or B) miss out on major features or performance elements. Ray tracing is the biggest potential "new thing" that requires semi-dedicated hardware and a software component, so I understand people focusing on it, especially if it actually takes off. For what it's worth, RT is available to some degree now; people see the shiny shit in Watch Dogs: Legion or whatever will be in Cyberpunk 2077 and will say "yeah, if I'm spending money I want that" if it's noticeable.

A generation or two ago (i.e., even my 1070's era) you could pay around $700 for a top-of-the-line 1080 Ti (even an AIB version, or perhaps only a little more unless it was a waterblock edition), but now we have cards that are $1000 to even $1500 on the high end (and that's at standard pricing). I can understand people wanting some degree of "future proofing" in both raw performance and major features. Worse, it's noteworthy that even with the 6900XT and, of all things, the 3090, there are still some questions about being able to play things at high/max game settings, 4K resolution, and especially with ray tracing enabled (for the moment only NV has an accessible form of DLSS to make up some of the performance deficit). While I'm really glad to see AMD's 6900XT (and the 6800XT, which seems to be almost as good as its big brother) do so well, especially in raw rasterized power, and while I can't necessarily fault them too much given Nvidia's generation-long head start, it appears right now that AMD cards are only "okay" at ray tracing and miss out entirely on DLSS (not to mention things like OpenGL support on Windows and encoder support), and that's a concern.

I'd really like to unreservedly buy AMD to stand against Nvidia's history of proprietary focus, but it's frustrating when I'm not sure what AMD's "real" feature set will be. It may be that, given a few months of driver and feature improvements, plus the whole host of next-generation games that are cross-platform with the AMD RDNA2-powered consoles, AMD will have something close to parity with, if not better than, these current NV features; but I can't know that for sure. Honestly, a part of me wants to support AMD so that NV will have less pull over the market (and fewer of the specifics or proprietary things they tend to favor), but I really want to see that AMD was thinking about this stuff. They HAD to know that RT would be important. They had to know that DLSS-like features need to be there, given their massive effective quality and performance jump. They know that Nvidia is able to push things like their remote/streaming tech using their encoders; AMD should favor an open alternative! And so on. Maybe AMD really is aware of all this and it's just going to take a little time to really see what these Big Navi RDNA2 cards can do, but I hope they aren't going to bury their heads in the sand and say "We finally compete on rasterized performance and do it at a good value!" only to have the market move in another direction with other features as well.
 
This right here is why you don't buy these AMD cards at those prices. If the $500 RTX 3070 is faster at Minecraft ray tracing than a $1k 6900XT, then AMD fucked up. Ray tracing is going to be the future of games, and ignoring ray-tracing performance on $500+ graphics cards is just stupid. What's worse is that this $1k graphics card still comes with GDDR6 and not GDDR6X like the RTX 3080 and 3090, so performance at higher resolutions tends to suffer. If AMD wanted a big win, they should have used GDDR6X.
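To put rough numbers on the memory point, here's a quick sketch using the commonly listed launch specs (256-bit / 16 Gbps GDDR6 for the 6900XT, 320-bit / 19 Gbps GDDR6X for the 3080, 384-bit / 19.5 Gbps for the 3090). Note this ignores the 6900XT's Infinity Cache, so raw bandwidth is only part of the picture:

# Rough raw memory-bandwidth comparison from bus width and per-pin data rate.
# GB/s = (bus width in bits / 8) * per-pin rate in Gbps
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(256, 16))    # 6900XT, GDDR6    -> 512 GB/s
print(bandwidth_gbs(320, 19))    # RTX 3080, GDDR6X -> 760 GB/s
print(bandwidth_gbs(384, 19.5))  # RTX 3090, GDDR6X -> 936 GB/s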

Honestly, none of these cards really do ray tracing well at this point. Even Nvidia makes you sacrifice a huge chunk of your frame rate for marginally better shadows. Right now, ray tracing is a lot like anti-aliasing was in its early days. It was a cool feature to play with, but until the Radeon 9700 came out you probably never used it, because you couldn't justify the frame rate hit. Nvidia is definitely a step ahead of AMD at this point, but it's not as if either is really delivering a great experience with the feature right now.
 
For some reason I had much higher expectations of the 6900XT 🤷‍♂️
Great value when compared to the 3090, but when compared to the 3080 or 6800XT, maybe not so much =/
I wonder what, if anything, nVidia will do to the 3090 pricing.
 
Not when you get RT involved.

Within a year, the more demanding RT games are all going to run slower on it than on the (cheaper) 3080.

The target market of this card is really hard to define (same VRAM as the 6800, so no real $300 value there), but ray tracing is not really usable yet, so it's moot... like Kyke pointed out, that's at least two generations away.
 
For some reason I had much higher expectations of the 6900XT 🤷‍♂️
Great value when compared to the 3090, but when compared to 3080 or 6800XT, maybe not so much =/
I wonder what if anything nVidia will do to the 3090 pricing.
Unlikely. Nvidia will just release the 3080 Ti, which will have 99% of the 3090's performance and 20GB of VRAM, for $999 (probably $1100-1200 for AIBs) to compete directly with the 6900XT. The 3090 will continue to be the "top end" product that only competes with itself at $1500.

I expect Nvidia will be shifting most GA102 ASICs to the 3080 Ti very shortly, making the 3090 harder to come by.
 
What do you consider usable frame rates, and does it work equally well on all titles?
 
I was hoping AMD might have killed it with this, and while for the price it's an amazing piece of hardware, I wanted them to take the performance crown for once. I'm still happy with my RTX 3090 purchase. Yeah, it's more expensive, but this is [H] and it's still the fastest.
I'm glad AMD is at least competitive and has a halo product.

"for the price it’s an amazing piece of hardware" except no, it isn't. That would be the 6800XT and the 3080. Lol @ our first world arguments... Most of the specs match the 6800XT exactly, 6900XT has 11% more compute units, 11% more shaders, and 11% more TMU's, but the memory bandwidth (bus width), VRAM (amount and speed), clock speed, boost clock, pixel rate, and ROP's are exactly the same as a 6800XT. And from reviews, it boosts to a lower clock speed by 50Mhz than the 6800XT, which is about 2%. The net result is 2% to 10% better performance than a 6800XT. So almost as good as Titan or Ti card improvements. Definitely the Halo product for the price premium.

For most*, the 6800XT for AMD fans, or the 3080 for Nvidia fans, makes more sense.
If a 3080 Ti comes out with 16 or 20GB of VRAM for $999 or less, the 6900XT will be 'meh'.

What's bullshit is the unavailability. Why are the AMD cards also so hard to find? Are they doing the miner shit too?

*(If you own a 3090 or 6900XT, this is not you. No need to reply justifying your decision, I was not talking about you)
 
I really don't get the whole "future proofing" argument, since nothing is "future proof". The situation changes every year, especially now with all the new features that constantly come out.

Maybe that was true at one point, when things were simpler, like with the 8800 GTX, yeah you got some years on that, but these days it's just not the case.

Makes more sense to me to buy what you need today, and then sell the card in 1 to 2 years and get whatever's new. And with the used market in these times you can typically get a good amount of money back.
 
To be fair, it's only an issue because of idiotic resolution increases. Personally, I'd still be fine with a 27'' 1080p monitor, and if that were the case a 1080 Ti would still be a monster for that resolution. However, if you want an updated monitor with a decent contrast ratio or features like HDR, you aren't getting 1080p anymore.
 
For some reason I had much higher expectations of the 6900XT 🤷‍♂️
Great value when compared to the 3090, but when compared to the 3080 or 6800XT, maybe not so much =/
I wonder what, if anything, nVidia will do to the 3090 pricing.

Nvidia won't change the 3090 pricing for the same reason that Starbucks won't stop charging $5 for a latte. People are buying every single one they can make, so why cut the price? I mean, I don't even see any of the previous-generation video cards below MSRP right now, let alone the new ones.
 
To be fair, it's only an issue because of idiotic resolution increases. Personally, I'd still be fine with a 27'' 1080p monitor, and if that were the case a 1080 Ti would still be a monster for that resolution. However, if you want an updated monitor with a decent contrast ratio or features like HDR, you aren't getting 1080p anymore.

Admittedly, there's a significant difference between 1080p and higher resolutions; many people have been using 1440p for a couple of years now, with 4K being the new thing. This is also affected by other display factors, notably refresh rate: in the early days of LCD displays, 60Hz was pretty much it, maybe with some overclocking to 75Hz, whether you ran 480p or 1080p. It wasn't until later that TVs and especially monitors could run at "serious" refresh rates like 90/120/144Hz or above, and FreeSync/G-Sync/VRR had a big role to play as well. Then there's HDR, image quality, backlighting, and so on.

Even today it's hard to find a GPU capable of properly pushing 4K resolution; it's only this most recent generation (NV 3080/3090, AMD 6800XT/6900XT) that can really do it, and now there's a NEW, hugely demanding potential game-changer in ray tracing, which makes even a $1500+ card struggle (and is somewhat mitigated by faking the resolution with DLSS). So it has been a long time coming, and it's almost like the hills are getting steeper, the equipment is getting more expensive, and new cards aren't just a massive overhaul in standard performance but are competing on new features. It's not just about running things at 1080p at 300 FPS, or even getting decent 4K/60 performance, but running 4K/120 (144) with HDR and ray tracing, etc.

Perhaps in two GPU generations, when there is (hopefully) a standardized ray-tracing methodology that is GPU- and OS-independent, all modern gaming-grade GPUs will have a more or less similar feature set and offer great performance at 4K+. Then again, we may also end up with a fractured landscape of proprietary tech that is good at one thing or one implementation so long as the stars align... and I'd rather not see that come to pass. Of course, we're already seeing TVs pushing 8K pictures, and that's as big a jump in raw pixels over 4K as 4K is over 1080p!
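For scale, a quick back-of-the-envelope pixel count (standard 16:9 resolutions assumed):

# Pixel counts of common 16:9 resolutions, to show the size of each jump.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
               "4K": (3840, 2160), "8K": (7680, 4320)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels)                          # 1080p ~2.1M, 4K ~8.3M, 8K ~33.2M
print(pixels["4K"] / pixels["1080p"])  # 4.0x more pixels than 1080p
print(pixels["8K"] / pixels["4K"])     # 4.0x again, the same size of jump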
 
I really don't get the whole "future proofing" argument, since nothing is "future proof". The situation changes every year, especially now with all the new features that constantly come out.
Honestly a 5-year-old 4c/8t PC has been future-proof this whole time. Maybe you can blame Intel for resting on their laurels, but it's only been in this last month that any new hardware has come out that dramatically shifted needs over to a new generation.

I can say for certain that my 7-year-old 4c/8t office PC was good enough even at 2K, being completely GPU-limited for gaming, until AMD pushed resizable BAR. I might be able to keep it going with a 6700 XT or 3060 Ti upgrade, but at that point I'll be leaving frames on the table.

Future-proof may not ever truly be achievable, but there are these nexus points where multiple performance thresholds and standards hold out for years, not months. Especially now, since consoles set the standards for PC gaming.

As much as I am nostalgic for the '00s and early-teens era of the PC hardware boom, that constant surge in hardware performance gains really did peter out.

Which is why now (or, more realistically, 2021) is so exciting again. We're getting off this plateau, but I expect to wind up on another one for a while afterward.
 
I really don't get the whole "future proofing" argument, since nothing is "future proof". The situation changes every year, especially now with all the new features that constantly come out.
Well, it all depends. Compared to the late '80s and '90s, computers from the 2010s have had quite a long practical life.

Imagine trying to play a recent game in 2000 on one of the biggest x86 machines of 1990, a 20MHz 386, versus trying to play a recent game in 2020 on a 12-thread Core i7-980X from 2010.

Or consider how long a 980 Ti/1080 Ti will remain an OK card for 1080p, versus how a GeForce2 Pro released in 2000 aged: already by 2003, compared against a 5900, it had about 80% fewer transistors, less than half the clock frequency, 80% less RAM that was also 75% slower, was two generations of DirectX behind, and was about four times slower in Quake 3. During that era, an actual doubling of performance could occur every 18-24 months.
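As a rough sanity check on that pace (assuming a doubling every ~18 months, which is just the figure quoted above):

# Compound performance growth at one doubling every ~18 months, 2000 -> 2003.
years = 3
doubling_period = 1.5  # years per doubling
speedup = 2 ** (years / doubling_period)
print(speedup)  # 4.0, in line with "about four times slower in Quake 3"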

The argument could be the opposite: future-proofing is now something we can actually get, given how much things have slowed down as the industry matured (just look at how long some people ran their 2500K/2600K systems, or for how long people have been building 16GB-of-RAM systems). The 24GB 3090 could still be a very capable card in 2027 for any game made with the PlayStation 5 / Xbox Series X in mind.

Now, is it a wise move (financially, for fun, etc.) to try to make a future-proof buying choice? For people who optimize and resell their stuff a lot, I can see it. On the other hand, the type of person who buys the newest, very expensive stuff is often the type who always upgrades anyway, and for whom future-proofing matters the least in one sense; in another sense, future-proofing in terms of resale value would still be an important variable for them.
 
Thanks LukeTbk, I could not agree more. Computers have far more staying power today than in decades past, because the reality is that software has not kept up with the pace of hardware development (compared to years past, when next year's software was sluggish on last year's hardware), and this despite year-over-year hardware improvements being a far cry from what we had in the 1980s and 1990s.

Further, when it comes to games, they are incredibly good at scaling to hardware. I'm sure I could get a passable Cyberpunk 2077 experience on a GTX 780 and X58 system as long as I temper my expectations (resolution, eye candy) accordingly.

The demise of SLI/CrossFire has, IMO, created a certain uniformity (or, alternatively, irrelevance) at the high end of GPUs... for most practical purposes, it doesn't really matter whether you get a 3080 or a 3090 when it comes to future-proofing. Ditto the 6800 and 6900. Now, a 3080 versus two 3090s in SLI? That might have been a meaningful step up that stood out for some time to come, but it seems that model is all but dead.
 
To be fair, it's only an issue because of idiotic resolution increases. Personally, I'd still be fine with a 27'' 1080p monitor, and if that were the case a 1080 Ti would still be a monster for that resolution. However, if you want an updated monitor with a decent contrast ratio or features like HDR, you aren't getting 1080p anymore.

Sorry, but I disagree. Moving from 1080p to 1440p was a huge difference for me. You’re close enough to the screen that higher resolutions actually matter. People say the same thing about “the human eye can only detect 60Hz”. I was fine with 60Hz, until I bought a 165Hz monitor. Now I find 60Hz painful to look at, like it literally causes me eye strain. There is an upper limit to where things make sense, but 60Hz/1080p is definitely not that upper limit.
 
I really don't get the whole "future proofing" argument, since nothing is "future proof". The situation changes every year, especially now with all the new features that constantly come out.

Maybe that was true at one point, when things were simpler, like with the 8800 GTX, yeah you got some years on that, but these days it's just not the case.

Makes more sense to me to buy what you need today, and then sell the card in 1 to 2 years and get whatever's new. And with the used market in these times you can typically get a good amount of money back.

You can’t future proof anything, but the situation most definitely does not change every year. That USED to be the case, but it isn’t any longer. My previous primary machine was in regular use for 6 years before I felt the need to upgrade (aside from the GPU, which I did after 5 years, and adding some RAM after Windows 10 made it apparent I could no longer get away with 8GB), and had I owned a 4770k instead of a 4670k, I’d quite likely still be running it aside from upgrading the video card.

I also personally find reselling things to be a bit of a pain in the ass, but I normally run my systems long enough that they have little resale value remaining anyway, so one of my relatives ends up getting it as an upgrade so they can discard their old PC (which is often one of my older ones as well).
 
Sorry, but I disagree. Moving from 1080p to 1440p was a huge difference for me. You’re close enough to the screen that higher resolutions actually matter. People say the same thing about “the human eye can only detect 60Hz”. I was fine with 60Hz, until I bought a 165Hz monitor. Now I find 60Hz painful to look at, like it literally causes me eye strain. There is an upper limit to where things make sense, but 60Hz/1080p is definitely not that upper limit.
I'd say 1440p is a decent upgrade at 27'', and I'd be fine with that. The issue is that to get the better panel technology at that size, the panels are now 4K minimum.

And again, personally, my perfect panel would be a 27'' 1080p HDR1000 144Hz+ 10-bit IPS panel. Unfortunately, you've got to jump all the way up to 4K to get the good specs.

Don't confuse my disdain for ultra-high PPI with disdain for high refresh rates. Each human is different; I get that. I personally can tolerate the PPI of 1080p on a 27'' panel. What I can't tolerate is a low refresh rate, a shitty contrast ratio, etc.
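For reference, the PPI numbers at 27'' (just the standard diagonal-pixels-over-diagonal-inches calculation):

import math

# Pixel density of common resolutions on a 27-inch diagonal.
def ppi(width_px, height_px, diagonal_in=27):
    return math.hypot(width_px, height_px) / diagonal_in

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: {ppi(w, h):.0f} PPI")  # ~82, ~109, ~163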
 
I'd say 1440p is a decent upgrade at 27'', and I'd be fine with that. The issue is that to get the better panel technology at that size, the panels are now 4K minimum.

And again, personally, my perfect panel would be a 27'' 1080p HDR1000 144Hz+ 10-bit IPS panel. Unfortunately, you've got to jump all the way up to 4K to get the good specs.

Don't confuse my disdain for ultra-high PPI with disdain for high refresh rates. Each human is different; I get that. I personally can tolerate the PPI of 1080p on a 27'' panel. What I can't tolerate is a low refresh rate, a shitty contrast ratio, etc.

Fair point. Unfortunately, panel companies crank the resolution because it's the easiest variable to improve and the easiest to market, even though it's not always the most important thing for visual fidelity.
 