AMD 7900XT and 7900 XTX Reviews

There is a lot of chatter about a respin of N31, so it would be prudent to hold off judgment, as I have doubts we have seen the full capabilities of N31 if that holds true.
While I am not 100% sure on this, I would not hold your breath waiting for that to happen. A lot of talking heads saying talking heads things.
 
True, but that doesn't change the fact that Cryengine supports RT without the use of dedicated RT hardware. It's an interesting solution.

Also, the Crysis Remaster has some of the same technology in it.
Cryengine isn't relevant anymore, and Neon Noir was already debunked as bogus, with the dev admitting a few months afterward that some things were prebaked. There's a reason that demo is almost 4 years old with not a peep since.

And nobody disputes RT can be done without specialized hardware. That was the counterargument to RTX in early 2019 - "Neon Noir can do RT without RTX, therefore RTX must be Nvidia PhysX 2.0 snakeoil!" The goalposts have moved in the interim.

Specialized hardware will just do RT faster, but that was always the case. RT isn't zero sum and Nvidia doesn't have a monopoly on it. They just happened to have the resources to do the heavy lift of making it viable at greater than 1 FPS.
 
There is nothing "sudden" about the driver team work schedule. They are grinding, and have been grinding. There is a sense of urgency there.

The simple fact of the matter is that Radeon has nowhere near the resources that NV does, and nowhere near as many games and configurations are tested; it is just that simple.

To be totally honest, I can't even believe AMD is still in the discrete GPU business at all anymore.
Wouldn't shock me if they call it quits in a couple more generations; we'll see how well this generation sells. It's a shame too; if people think Nvidia is greedy now, wait until their only competition is Intel or no one.
 
Not quite; they used a voxel-based global illumination process, which Nvidia again sort of wrote the book on, 5 years prior to the Neon Noir release.
https://on-demand.gputechconf.com/s...el-Based-Global-Illumination-Current-GPUs.pdf
You can find papers on everything somewhere. Not much commercialized tech in silicon wasn't written about in some university a decade prior. Not sure how the Cryengine version compares to Epic's. I have to assume, like most people, that Epic's solution is going to be the default for the next decade.
 
While I am not 100% sure on this, I would not hold your breath waiting for that to happen. A lot of talking heads saying talking heads things.
I'm just a speculator with no inside sources...
I would have to assume that if AMD has an N31 respin at this point, unless they are really short on stock, it's going to be a refresh part.
I would speculate that if the rumors of a "respin" of the same part are true, they will wait till spring and release it as a 7950 XTX or something: push the frequency up enough to get around 10% more performance and just phase the 7900 out. I think it would be pretty similar to the RX 580-to-590 situation, no die shrink in this case, but basically the identical design, same CU count, etc.
 
There is nothing "sudden" about the driver team work schedule. They are grinding, and have been grinding. There is a sense of urgency there.

The simple fact of the matter is that Radeon has nowhere near the resources that NV does, and nowhere near as many games and configurations are tested; it is just that simple.

To be totally honest, I can't even believe AMD is still in the discrete GPU business at all anymore.
Yeah. I am a little surprised, though, that they haven't invested more there, given their research budget. Maybe more is invested on the CPU side, since there is a lot more revenue there. Hopefully with chiplet designs they finally get to reap more profits, even if they are a little slower, and invest more back, especially in GPU drivers. It does seem they just focused on stability and popular games, and the rest are kind of all over the place, lowering the overall performance target.
 
Wouldn't shock me if they call it quits in a couple more generations; we'll see how well this generation sells. It's a shame too; if people think Nvidia is greedy now, wait until their only competition is Intel or no one.
I doubt they ever call it quits... they like the console income on paper. They also know they need GPU compute elements for their data center APUs. They may, however, decide to once again not worry about flagship performance. If AMD is doing the heavy lifting design-wise for consoles/embedded (like Tesla and KFC :)) it probably isn't a big outlay of cash to turn out midrange cards. They may, however, decide it's not worth the effort to try to compete in the halo market.
 
I doubt they ever call it quits... they like the console income on paper. They also know they need GPU compute elements for their data center APUs. They may, however, decide to once again not worry about flagship performance. If AMD is doing the heavy lifting design-wise for consoles/embedded (like Tesla and KFC :)) it probably isn't a big outlay of cash to turn out midrange cards. They may, however, decide it's not worth the effort to try to compete in the halo market.
That's honestly fine, I don't even think there is a mid range market anymore really. I guess the reduced price 6800 XT and 6900 XT are the top end of mid range, but it really sucks there's not much value there anymore. With Raja heading Intel's graphics department I see them only being mid range as well. Letting Nvidia have the high end is totally fine if they can eat up some market share. Yes, you're right about consoles and data center.
 
Wouldn't shock me if they call it quits in a couple more generations; we'll see how well this generation sells. It's a shame too; if people think Nvidia is greedy now, wait until their only competition is Intel or no one.
I honestly highly doubt that. They are finally at the point where they don't really have to care about significant market share, and they have probably increased their profits much more with the chiplet design and its great yields.

The 7900 XTX seems to be selling out everywhere, so they do have a win on their hands, and it should be something they build on. I even think they can still drop the price on these and have very healthy margins. They have landed the chiplet ship; now they have to keep building on it.
 
I doubt they ever call it quits... they like the console income on paper. They also know they need GPU compute elements for their data center APUs. They may, however, decide to once again not worry about flagship performance. If AMD is doing the heavy lifting design-wise for consoles/embedded (like Tesla and KFC :)) it probably isn't a big outlay of cash to turn out midrange cards. They may, however, decide it's not worth the effort to try to compete in the halo market.
Yeah, I agree. I don't think that means they won't have a high-end part, given that with chiplets they are not making one huge chip anymore. I do believe they don't really care about being 15-20% slower than Nvidia's $1,600 part, which is not a bad thing. If each gen they can come in 15-20% slower and keep improving on the ray tracing piece, they can afford to sell for cheaper with the chiplet design, given they don't build on the bleeding-edge node either.
 
While I am not 100% sure on this, I would not hold your breath waiting for that to happen. A lot of talking heads saying talking heads things.
The rumors about Navi 3+ aren't exactly the flex that a lot of people on Reddit seem to think they are. You can either have a flawed product that is going to be phased out and fixed in a future hardware revision, or you can have future driver revisions that greatly improve performance; AMD doesn't have the resources to do both.
It would also indicate that they chose to appease stockholders and keep their release schedule intact rather than release a correctly functioning product.
 
And GPU sales still give AMD buying leverage with TSMC as well. AMD's business plan overall is very nuanced.
And keeping their GPU development active is what gives them access to the console market, let that slip, and it's a bad day for AMD as a whole.
I mean, they could kill off their GPU division and put those extra resources into CPUs and probably not need to change their TSMC orders at all, but it would mean giving up on some very lucrative enterprise markets, which would tank AMD's shareholder reputation.
AMD is more than happy playing second fiddle to Nvidia as long as their shareholders are content and they can maintain their brand.
If (big if here) Intel were to surpass them in GPU sales and capabilities, though, there would be a shareholder riot.
 
The simple fact of the matter is that Radeon has nowhere near the resources that NV does, and nowhere near as many games and configurations are tested; it is just that simple.

To be totally honest, I can't even believe AMD is still in the discrete GPU business at all anymore.
The budget and resource disparity between AMD and NV gets lost in the winner-take-all GPU mindset. I'm simply baffled by how AMD pulled off what they did.

If a 4090 gets 150FPS in Cyberpunk 4K (or whatever) with everything maxed, and a 7900XTX gets 142FPS - and you saved $600 for 8FPS less AND minimum frames above your monitor's capabilities? That's a massive win.

Also, AMD leaping forward two RT generations right onto NV's heels, to where the best RTX card was only a few months ago, after Nvidia had a 3-4 year head start? Equally mind-boggling.
 
And keeping their GPU development active is what gives them access to the console market, let that slip, and it's a bad day for AMD as a whole.
I mean, they could kill off their GPU division and put those extra resources into CPUs and probably not need to change their TSMC orders at all, but it would mean giving up on some very lucrative enterprise markets, which would tank AMD's shareholder reputation.
AMD is more than happy playing second fiddle to Nvidia as long as their shareholders are content and they can maintain their brand.
If (big if here) Intel were to surpass them in GPU sales and capabilities, though, there would be a shareholder riot.
"Diversity is our strength."
 
Wouldn't shock me if they call it quits in a couple more generations; we'll see how well this generation sells.
A bit like Intel: can they afford to do that?

Maybe becoming more Nvidia-like: passing data center GPUs down to gaming cards with less specialized effort.

With how well those products are selling, they have the advantage of a relationship with the foundry, a portfolio of products to adjust if Zen 4 does not sell as expected, and a floor set by the console APU and laptop APU markets.

Maybe chiplets crack the sector open in a couple more generations and they are at the forefront.
 
AMD provides the CPU/APU/GPU components to all of the consoles anyone wants to buy and will do the same for the next gens coming probably in 2025 or 2026.......you see those sales numbers? AMD could abandon the discrete GPU market and still light their farts on fire with $10,000 bills.
 
...you see those sales numbers? AMD could abandon the discrete GPU market and still light their farts on fire with $10,000 bills.
I would like to see the margins to comment on this: how many $400 PS5 APUs do you need to sell to make the money of a single $10,000 Epyc or $1,000 7900 XTX?

A PS5 APU is 260mm² on 6nm (it was 300mm² on 7nm at launch); the 7950X is 2×70mm² plus a 124mm² IO die.

Say they manage to charge $150 for it (which would leave $250-300 for 16GB of expensive RAM, a 1TB NVMe drive, and so on); that is probably a lot of console APUs you need to sell to make up for the sale of a single CPU/GPU.
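As a rough sketch of that comparison (every margin figure below is an illustrative guess on my part, not AMD's actual numbers):

```python
# Back-of-the-envelope: how many console APU sales equal one Epyc sale?
# All margins here are illustrative assumptions, not real AMD figures.

apu_price = 150      # assumed price AMD charges per console APU
apu_margin = 0.20    # assumed thin console margin -> $30 profit per APU

epyc_price = 10_000  # list price of a high-end Epyc
epyc_margin = 0.50   # assumed fat server margin -> $5,000 profit per CPU

profit_per_apu = apu_price * apu_margin
profit_per_epyc = epyc_price * epyc_margin

apus_needed = profit_per_epyc / profit_per_apu
print(f"~{apus_needed:.0f} console APUs per Epyc sale")
```

Even with generous guesses, it takes on the order of a hundred-plus console APUs to match the profit of one flagship server CPU, which is why the console business reads as a volume play rather than a margin play.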
 
I would like to see the margins to comment on this: how many $400 PS5 APUs do you need to sell to make the money of a single $10,000 Epyc or $1,000 7900 XTX?

A PS5 APU is 260mm² on 6nm (it was 300mm² on 7nm at launch); the 7950X is 2×70mm² plus a 124mm² IO die.

Say they manage to charge $150 for it (which would leave $250-300 for 16GB of expensive RAM, a 1TB NVMe drive, and so on); that is probably a lot of console APUs you need to sell to make up for the sale of a single CPU/GPU.
Yes and no? AMD sells the APU and memory as a package; they don't make a lot, but it's guaranteed and it keeps them in the game. They sell their GPUs to the AIBs in the same configuration, and they may make more, but not a lot more. The consoles also get cheaper every year; it may only break even in year one, but it turns an increasing profit each subsequent year as the nodes get cheaper and the yields improve.
But in AMD's case, they can’t stay relevant without the consoles in most consumer spaces.
IBM was happy to be out of that space, which would have left the last two generations to Nvidia. Imagine a world where all games were built with DLSS, RTX, CUDA, and G-Sync and none of the open alternatives. Where would AMD be competing there? Nvidia makes some good ARM CPUs as well, and yeah, AMD would still have the Ryzen CPU, but without the recognition from the consoles, for most people they would still be "not Intel".
The consoles for AMD are a long-term investment in keeping Nvidia and Intel from running away from them due to their R&D budgets.
R&D for a console can be worked in as a 5-year return; for GPUs, CPUs, and their Epycs, they have to work those R&D costs down to a 9-18 month cycle instead. It costs a lot to remain relevant there, and there are also much larger demands for drivers, support, and various other tasks that require people on deck, which cut into those margins a lot more than you think. This is why AMD loves open source: not because they believe in any of that stuff, but because it greatly cuts down on their costs, since volunteers are doing a lot of their work for them.
 
Pretty cool; that ASUS TUF 7900 XTX is really something else, especially when OC'ed.

 
Yeah, I don't get this. Either way, if you are playing a heavy RT game at 4K, you have no choice but to use upscaling, and then it doesn't matter if it's a 7900 XTX, a 4080, or a 4090. Sure, the 4090 is nicer, but it becomes playable with the other cards too. To my eyes it doesn't matter, so I just turn off ray tracing and I don't feel like I'm missing out in the game lmao.
Uh, sure it does. First off, if I had to use upscaling on AMD, I couldn't use DLSS. It would have to be FSR, which simply doesn't look as good at the same resolution. And DLSS Quality mode has basically no IQ loss; if I had to drop to Balanced or Performance mode depending on the card I had, each setting looks substantially worse than native/Quality mode.

Depending on the game, ray tracing makes a pretty huge difference. DL2 looks like shit without RT global illumination, as an example.
 
Uh, sure it does. First off, if I had to use upscaling on AMD, I couldn't use DLSS. It would have to be FSR, which simply doesn't look as good at the same resolution. And DLSS Quality mode has basically no IQ loss; if I had to drop to Balanced or Performance mode depending on the card I had, each setting looks substantially worse than native/Quality mode.

Depending on the game, ray tracing makes a pretty huge difference. DL2 looks like shit without RT global illumination, as an example.
FSR2 is pretty solid. IDK how people tell the difference without looking at stills, to be honest. Sure, maybe in Cyberpunk I could tell, but I saw a video on YouTube where they did a test among themselves, and they were more wrong than right at telling when ray tracing was on in most games, because it wasn't that obvious lmao. I am the same way.
 
And GPU sales still give AMD buying leverage with TSMC as well. AMD's business plan overall is very nuanced.
That is a good point. I imagine chiplets on everything now also means AMD is a prime customer for all the one-or-two-nodes-back capacity TSMC wants to sell.
 
AMD provides the CPU/APU/GPU components to all of the consoles anyone wants to buy and will do the same for the next gens coming probably in 2025 or 2026.......you see those sales numbers? AMD could abandon the discrete GPU market and still light their farts on fire with $10,000 bills.
I'm not sure they are really making big money on that. The margins on the MS/Sony parts are not insanely high. I think the console business is just more about big volume. Shareholders love big volume.
 
I run a lawn business, and I'll tell you, you will lose the company if you're stupid; it's very easy to make money in business if you know what you're doing. Running the business how you see fit is a personal choice, and it all comes down to the CEO.

A company will not sell at a loss unless the loss is not high enough to put them in the red. If anything, they will make some money or break even; you lose money if you spend money unwisely.
 

https://www.hwupgrade.it/articoli/s...sione-geforce-rtx-4080-accerchiata_index.html

Not much of a sample size that I could find (or many games to test with to start, I think; it could be nice to see some of the demos tested, but I can imagine that being misleading).

The XTX is more like 40-45% higher than the 6950 XT in those quite extreme workload types, rather than 30-35%, at least in that single example.
This looks like AMD's brand new flagship card is about on par with Nvidia's previous generation 3rd-down-from-flagship card (the 3080). I'm looking at 4K, of course. At 1080p, the tight grouping makes it seem that there are other items in the system which are holding back the GPUs.

If UE5 is the most hardware-agnostic and most pervasive engine, then this really does not look good for AMD going forward. They're going to have to pull an awfully large rabbit out of an awfully small hat.

That means raster is king and will still be king for a minimum of 3 more GPU generations.
This statement is going to age like fine milk.

This is what a lot of people are missing. The lighting engine in UE5 is going to be a HUGE deal and is very likely to "strike a blow" to RTX going forward. UE5 is going to simply dominate the PC/Console game engine world going forward most likely.
It's still very early, but the plot that Luke posted seems to show the RTX hardware running UE5 appreciably faster than AMD.
 
Like with basically every AMD product from Fiji onwards, the problem with this card is that it undercuts the competing card from Nvidia, but not by enough to justify the reduced feature set, buggier drivers, lower edge case performance, and generally worse user experience. The $1000+ super-premium graphics card segment isn't exactly price sensitive either, so the fact that the card is $300 cheaper is less likely to matter.

Right now it doesn't matter, because there are only four cards at this performance tier and they're all sold out, but if/when both companies ramp their volumes (really, when TSMC ramps N5) I think there are going to be price shifts. Looking at it this way:
  • GCD wafers cost somewhere around $15K; you can fit ~200 GCD per wafer and if we assume 80% yields (sadly TSMC's yields are an incredibly tightly kept secret) we're looking at roughly $100 per bare die, plus $30-40 or so of MCDs. If we ballpark double that for test/package we're looking at probably $250 per packaged chip from AMD.
  • GDDR6 is something like $15/GB in low quantities, call it $10/GB if you are buying it by the pallet, so the 24GB of memory costs the AIB $240.
  • Boards are cheap, it's probably $50 for the PCB + assembly.
  • If we budget $100 for power delivery + passives, we get a total cost of $650 + AMD margins to build a 7900 XTX.
So there is room to go down - we know AIB margins are shit so AMD is making a lot off each packaged chip right now, but once N5 and GDDR6 prices go down there's definitely room for a flagship tier part at the 3080's old price of ~$700. Regardless, one thing's for sure: the 4080 and 4090 are priced way higher than they "need" to be (in the sense that you could be paying $999 for a 4090 and everyone in the chain would be making pretty standard profits).
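The line items above can be tallied in a quick sketch (using the same assumed figures as the list; none of these are confirmed costs, and TSMC yields in particular are a guess):

```python
# Rough 7900 XTX bill-of-materials estimate from the assumed figures
# above -- every number here is an assumption, not a confirmed cost.

wafer_cost = 15_000        # assumed N5 wafer price, USD
gcds_per_wafer = 200       # assumed gross GCDs per wafer
yield_rate = 0.80          # assumed yield

gcd_cost = wafer_cost / (gcds_per_wafer * yield_rate)  # cost per good die
mcd_cost = 35                                          # midpoint of $30-40
packaged_chip = 2 * (gcd_cost + mcd_cost)              # ~2x for test/package

gddr6 = 24 * 10            # 24 GB at an assumed $10/GB in pallet quantities
board = 50                 # PCB + assembly
power = 100                # power delivery + passives

total = packaged_chip + gddr6 + board + power
print(f"packaged chip ~${packaged_chip:.0f}, card total ~${total:.0f}")
```

Under those assumptions the packaged chip lands around $260 and the full card around $650, which is where the "$650 + AMD margins" figure comes from; every input can of course shift the total materially.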
 
FYI There was just a drop of the 7900 XTX on AMD.com and I got one. If you're looking for one, they are in stock.
I would rather get an AIB card with an unlocked BIOS. That way you can get way more performance than with a reference model.

The Red Devil or Nitro+ would be the ones to look for (not sure if ASRock or XFX has an unlocked version yet).
 
I would rather get an AIB card with an unlocked BIOS. That way you can get way more performance than with a reference model.

The Red Devil or Nitro+ would be the ones to look for (not sure if ASRock or XFX has an unlocked version yet).
I wanted the reference model because it's 11.3 inches long. That will fit in any of my other home systems after the card hits its expiration date. My 6900 XT is an aftermarket monster from MSI, and it will languish on a shelf until I either sell it for next to nothing or find a friend with a case big enough for a 14+" card... Plus, paying an extra 100-500 bucks for an overclocked version of the same card never really sat well with me. Might as well just get yourself a 4090 at that point.
 
I wanted the reference model because it's 11.3 inches long. That will fit in any of my other home systems after the card hits its expiration date. My 6900 XT is an aftermarket monster from MSI, and it will languish on a shelf until I either sell it for next to nothing or find a friend with a case big enough for a 14+" card... Plus, paying an extra 100-500 bucks for an overclocked version of the same card never really sat well with me. Might as well just get yourself a 4090 at that point.
Well, the Red Devil is only $100 more, and if it can overclock to get very close to 4090 performance, that's a win in my book.
 
Well, the Red Devil is only $100 more, and if it can overclock to get very close to 4090 performance, that's a win in my book.
If the ASUS TUF can do it, the Red Devil likely can too. You can probably even do it on the reference card, for that matter. Just keep it in a well-ventilated case or drop a water block on it aftermarket.

I don't like leaving performance on the table, but from a practical standpoint, the inability to move my 6900XT over to one of my older PCs almost feels criminal to me. Given I paid a grand for that one too. So, there is a method to my madness this time.
 
If the ASUS TUF can do it, the Red Devil likely can too. You can probably even do it on the reference card, for that matter. Just keep it in a well-ventilated case or drop a water block on it aftermarket.

I don't like leaving performance on the table, but from a practical standpoint, the inability to move my 6900XT over to one of my older PCs almost feels criminal to me. Given I paid a grand for that one too. So, there is a method to my madness this time.
You cannot, as you are limited by power: only 2x8-pin on the reference cards.

Seeing all these overclocking results, it is now clear AMD could have easily gone toe to toe with the 4090 if they had used the same power and cooling as Nvidia did.

Glad to see the AIBs have the ability to do that, it seems.
 
Just like to add that the Sapphire Nitro+ is only $1,100 as well. Seeing as, once it is overclocked, it can almost match a 4090, that's a pretty good $500 in savings!

Now the real question is: can you find one lol
 
You cannot, as you are limited by power: only 2x8-pin on the reference cards.

Seeing all these overclocking results, it is now clear AMD could have easily gone toe to toe with the 4090 if they had used the same power and cooling as Nvidia did.

Glad to see the AIBs have the ability to do that, it seems.
AMD stated that they left a ton of wiggle room in the architecture and they said that almost six months ago. I suspect I can squeeze some more performance out if I really want to.

The only way you get 4090 performance is by buying a 4090. It can likely be overclocked as well... If you're talking matching raster performance, sure, maybe.

I'm just happy I didn't pay over MSRP this time & I can re-use the card when I drop in the next best thing in a year or two.

It all comes down to personal use case. You do you.
 
AMD stated that they left a ton of wiggle room in the architecture and they said that almost six months ago. I suspect I can squeeze some more performance out if I really want to.

The only way you get 4090 performance is by buying a 4090. It can likely be overclocked as well... If you're talking matching raster performance, sure, maybe.

I'm just happy I didn't pay over MSRP this time & I can re-use the card when I drop in the next best thing in a year or two.

It all comes down to personal use case. You do you.
Overclocking a 4090 might get you 4-5% more performance while hitting 500W+.

So far the good AIB 7900 XTXs are showing around 21% more performance from overclocking, while using around 420W+.
 
I like the reference cards for the size and the 2x8 pin power connectors because they fit into all of my SFF cases and my existing sfx PSUs aren’t an issue.

However!

That Red Devil card looks kind of awesome despite being a chonker. Same for the Sapphire Nitro. If I went AIB, the Nitro is number one on my wish list.
 
Well, the Red Devil is only $100 more, and if it can overclock to get very close to 4090 performance, that's a win in my book.
There will almost certainly be an unlocked BIOS for the reference card. I just got mine, and it's probably the heaviest card I've ever owned, surprisingly so. Noticeably heavier than the other hefty card I recall having, a 2080 Ti Aorus.
 
There will almost certainly be an unlocked BIOS for the reference card. I just got mine, and it's probably the heaviest card I've ever owned, surprisingly so. Noticeably heavier than the other hefty card I recall having, a 2080 Ti Aorus.
Well, let's be honest: an unlocked BIOS on a reference card won't help all that much, since you will be limited by the 2x8-pins.
 