AMD Ryzen 9 7950X3D CPU Review & Benchmarks: $700 Gaming Flagship

Some of us were out here reminding people AMD was just like any other corporation, so why this would be surprising to anyone is a mystery.

AMD has worked very hard to cultivate an image of being the more reasonable, friendly CPU company, but it is all a farce.

When they have been cheaper, it has been because they had no other option but to compete on price. When they had the lead back in the early 2000s, they sold top-end single- and dual-core Athlon 64s for over a thousand bucks, a price that was unheard of back then.

Just a few years earlier I had bought the fastest CPU money could buy for about $300.

The priciest from my recollection were the single-core Athlon 64 FX-57 and the dual-core FX-60 and FX-62, which all had an MSRP of $1,031, but the dual-core Athlon 64 X2 4800+ wasn't far behind at $1,001.

Corrected for inflation from mid-2005 to early 2006, this is ~$1,600 today.
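The math behind that figure is simple compounding. A back-of-the-envelope sketch (the ~1.55 cumulative CPI multiplier from mid-2005 to today is an assumed round figure here, not an official statistic):

```python
# Back-of-the-envelope inflation adjustment for the FX-57's MSRP.
# The cumulative multiplier is an assumed round figure for mid-2005
# to today; the exact value depends on which CPI series you use.
MSRP_2005 = 1031          # Athlon 64 FX-57 launch price, USD
CPI_MULTIPLIER = 1.55     # assumed cumulative inflation since mid-2005

adjusted = MSRP_2005 * CPI_MULTIPLIER
print(f"${adjusted:,.0f} in today's dollars")  # roughly $1,600
```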

If you think AMD is the enthusiast's friend, just look at 2005 CPU pricing to realize they are not now and never were. They are a company like any other, and when they can, they will charge for their products.

Their "user friendly" open source approaches to Free-Sync, HBM and the like are really just done out of necessity.

All corporations are in it to maximize profits. AMD is no different.
 
Though $600 for a 7900X, mem, and board is unbeatable
Yup. Crazy value, never regret grabbing it. If the 7800x3d turns out great, I'd have no issue with selling the 7900x for one since that combo let me get into AM5 so easily.
 
so AMD is no longer trying to undercut Intel on pricing and is now going crazy with their high end CPU pricing
...why on earth should they? There isn't a single thing that Intel is good at right now that isn't achieved by brute force TDP breaking.

AMD has the better architecture, the better manufacturing, the better scalability, the better thermal performance... by far.

And it's true on desktop, mobile and datacenter.
 
...why on earth should they? There isn't a single thing that Intel is good at right now that isn't achieved by brute force TDP breaking.

AMD has the better architecture, the better manufacturing, the better scalability, the better thermal performance... by far.

And it's true on desktop, mobile and datacenter.
They are so close in performance by nearly every metric, so I would suggest putting down whatever AMD drink you are currently enjoying, because it is not doing you any favors.
 
...why on earth should they? There isn't a single thing that Intel is good at right now that isn't achieved by brute force TDP breaking.

AMD has the better architecture, the better manufacturing, the better scalability, the better thermal performance... by far.

And it's true on desktop, mobile and datacenter.
The big thing they don’t have is their own fab - like Intel. Let’s hope they have their orders locked in. Which also may explain the pricing…
 
The big thing they don’t have is their own fab - like Intel. Let’s hope they have their orders locked in. Which also may explain the pricing…
I was actually just about to edit that into my post; in addition to having the lead on basically everything, they also have to pay somebody else to fab for them.

If anybody should be undercutting to make a living right now it's Intel.
 
They are so close in performance by nearly every metric, so I would suggest putting down whatever AMD drink you are currently enjoying, because it is not doing you any favors.

They are very similar in top-end performance, but in actual in-game power usage, Intel is anywhere from 2 to 3 times higher.

This suggests that AMD has more room for improvement, whereas Intel is bouncing off the ceiling of their capability right now.
That and the AMD chip will be cheaper in operation (less power use) and cooler and quieter at load.

I'm no fanboy. I've owned CPU's from Intel, AMD and Cyrix based on what the best bang for the buck was when I was buying. Right now AMD does appear to have the edge on CPU's though. Who knows how this will look next gen as Intel's newer processes continue to come online.

Right now, at this particular moment Intel is doing what they are doing only by very stringent binning and pushing the voltage curves outside of their optimal range, so the perf/watt goes in the toilet.

If you don't care about power and noise, that is fine I guess. Everyone's priorities are different.
 
Ah. The Nvidia business model. Its proven to work on the masses so why not imitate it?

It works in every industry with every product. When you have a class-leading product you can charge more for it. The purpose of corporations is to earn returns for their shareholders, so whenever they can charge more, they will.
 
They are very similar in top-end performance, but in actual in-game power usage, Intel is anywhere from 2 to 3 times higher.

This suggests that AMD has more room for improvement, whereas Intel is bouncing off the ceiling of their capability right now.
That and the AMD chip will be cheaper in operation (less power use) and cooler and quieter at load.

I'm no fanboy. I've owned CPU's from Intel, AMD and Cyrix based on what the best bang for the buck was when I was buying. Right now AMD does appear to have the edge on CPU's though. Who knows how this will look next gen as Intel's newer processes continue to come online.

Right now, at this particular moment Intel is doing what they are doing only by very stringent binning and pushing the voltage curves outside of their optimal range, so the perf/watt goes in the toilet.

If you don't care about power and noise, that is fine I guess. Everyone's priorities are different.
And I'm sure ZeroBarrier was a big proponent of the FX-9370 efforts during the Bulldozer days. 😉😉😉

At least the 9370 was wasting everyone's time at $200 and not at $600 like these 13900K/Ss are...
 
But why do people care about 60w differences? That's what a single mid range incandescent light bulb uses in an HOUR.

Most people care enough that they no longer have any incandescent bulbs. I don't think I've seen one in actual use around here in several years.

It's not that just one 60W bulb uses a ton of power (though if you do leave one on 24/7 for a month, at 25¢/kWh, which is what power costs here in the winter, it adds up to about $11 per month).

If you have a bunch of light bulbs in your house, pretty soon it adds up to real money, and at the price the LED bulbs cost now, they can pay for themselves in a month, so there is no reason not to.

Same with a PC, it adds up.

That, and higher power use means more heat, and more heat means more annoying fan noise to keep it cool. Who doesn't like gaming in silence? :p

That said, it is true. With this generation of GPUs, 60 watts one way or another is going to have a relatively small impact compared to that 450W GPU.
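The $11/month figure above checks out. A quick sketch of the arithmetic (60W running 24/7 for a 30-day month at the 25¢/kWh winter rate quoted above):

```python
# Monthly cost of a constant 60 W load at $0.25/kWh.
WATTS = 60
HOURS_PER_MONTH = 24 * 30          # 30-day month, running 24/7
RATE_PER_KWH = 0.25                # winter rate quoted above, USD

kwh = WATTS / 1000 * HOURS_PER_MONTH   # 43.2 kWh
cost = kwh * RATE_PER_KWH
print(f"{kwh:.1f} kWh -> ${cost:.2f}/month")  # 43.2 kWh -> $10.80/month
```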
 
I rather think that $530 for the 13900K is underpriced and that $699 for the AMD is about right (also based on what the FX-55 and friends cost).
 
And I'm sure ZeroBarrier was a big proponent of the FX-9370 efforts during the Bulldozer days. 😉😉😉

At least the 9370 was wasting everyone's time at $200 and not at $600 like these 13900K/Ss are...

Exactly. The 13900KS is a slightly more competitive FX-9590. You hit it spot on.

Same tricks being used to try to compete, and doing that only works short term. You have to come up with an actual improvement.
 
so AMD is no longer trying to undercut Intel on pricing and is now going crazy with their high end CPU pricing
Unless you've been in a cave, you should realize they priced the 7950X3D the same as the 7950X launched at, and have dropped the 7950X since then. So they have more chips than Intel does right now. They have something for all price tiers. Don't just look at the highest-end version Lmao.
 
Unless you've been in a cave, you should realize they priced the 7950X3D the same as the 7950X launched at, and have dropped the 7950X since then. So they have more chips than Intel does right now. They have something for all price tiers. Don't just look at the highest-end version Lmao.
Yeah - the fire sales of the past several months had nothing to do with Zen 4 underperforming sales wise... :ROFLMAO:
 
Yeah - the fire sales of the past several months had nothing to do with Zen 4 underperforming sales wise... :ROFLMAO:
I mean people just see prices of the latest parts. If anyone expected the 16-core X3D to cost less than the original 7950X, they just had horrible expectations Lmao. That's all I am saying. We got cheaper parts and different tiers to choose from. The fact that they dropped the highest-end X3D at $699.99 was a surprise. Everyone was expecting $799.99.
 
Yeah - the fire sales of the past several months had nothing to do with Zen 4 underperforming sales wise... :ROFLMAO:

Yeah, AM5 has been a tough sell, coming out as it has during a period when people are concerned about their finances, and requiring a new motherboard, CPU and RAM for those who go for it.

The CPU's themselves are good, and the pricing isn't too bad, but motherboard pricing has been kind of nuts, as has DDR5 (at least in the beginning; I haven't looked lately).

Intel was wise to allow for DDR4 fallback as it reduces the barrier to entry for those who are upgrading.
 
Yeah, AM5 has been a tough sell, coming out as it has during a period when people are concerned about their finances, and requiring a new motherboard, CPU and RAM for those who go for it.

The CPU's themselves are good, and the pricing isn't too bad, but motherboard pricing has been kind of nuts, as has DDR5 (at least in the beginning; I haven't looked lately).

Intel was wise to allow for DDR4 fallback as it reduces the barrier to entry for those who are upgrading.
Yeah. That is what saved Intel - DDR4. Otherwise, feature for feature - Z790 boards are no cheaper than X670E.
 
Love that power usage while gaming, with solid performance too. It's exactly what I want in my rig.

Guess I'm picking up the 7800X3D day one.
I'm in the same boat, just between the 7800x3D/7900x3D still. This is exactly what I need to justify the overhaul. My old 7700k is starting to get long in the tooth, and I was originally eyeballing either the 7700x or the 7800x (7700x would have been kinda funny, just swapping the last letter and all), but after seeing the 5800x3D and hearing the same was coming to the 7-series *soon*, I'm glad I've continued to hold out.

It's finally time to do the great "downward shuffle" of PC components from my machine to the wife's and all the other random PC's strewn about the house come April.

Honestly, I'll be sticking with my 5800x3d it seems. Thought this would prove to show better gains, but it really doesn't.

I absolutely would hold firm too. Goes to show how strong the 5800x3D was/still is!
 
Paul's Hardware benched Flight Sim, including 1% lows. Didn't see that title from the usual reviewers.
 
I'm the one drinking? 😂 Okay...
Put the drink down and do a little digging, but here, let me spoon feed you:
https://www.anandtech.com/show/17641/lighter-touch-cpu-power-scaling-13900k-7950x/3
They are very similar in top-end performance, but in actual in-game power usage, Intel is anywhere from 2 to 3 times higher.

This suggests that AMD has more room for improvement, whereas Intel is bouncing off the ceiling of their capability right now.
That and the AMD chip will be cheaper in operation (less power use) and cooler and quieter at load.

I'm no fanboy. I've owned CPU's from Intel, AMD and Cyrix based on what the best bang for the buck was when I was buying. Right now AMD does appear to have the edge on CPU's though. Who knows how this will look next gen as Intel's newer processes continue to come online.

Right now, at this particular moment Intel is doing what they are doing only by very stringent binning and pushing the voltage curves outside of their optimal range, so the perf/watt goes in the toilet.

If you don't care about power and noise, that is fine I guess. Everyone's priorities are different.
Scaling the power on the Intel CPU while keeping nearly identical performance just tells us Intel still sucks at power scaling their own CPUs. See the above link.
And I'm sure ZeroBarrier was a big proponent of the FX-9370 efforts during the Bulldozer days. 😉😉😉

At least the 9370 was wasting everyone's time at $200 and not at $600 like these 13900K/Ss are...
Bulldozer wasn't worth the sand that was used to make the silicon.
 
Scaling the power on the Intel CPU while keeping nearly identical performance just tells us Intel still sucks at power scaling their own CPUs. See the above link.

It's pretty typical that a given combination of architecture and fab process results in a chip that has a voltage or power sweet spot where it gets the best perf/watt, and as you move outside of that sweet spot either on the low side or on the high side, the perf/watt suffers.

This is why I never liked it when they use the same design for both mobile and desktop parts. For mobile, power is a top priority, so you'd really want an architecture that targets that envelope in a way that allows mobile chips to maximize perf/watt in the low power range. For the desktop - however - while we don't completely ignore power, it is a much smaller concern, and as such we could benefit from an arch that maximizes performance in high-power situations.

In the 90's mobile CPU's were just scaled down desktop CPU's, but at some point this changed, and mobile became the priority, with chips primarily designed for low power optimization, and then attempted to be scaled up for the desktop, resulting in many compromises.

I argue we should really have separate designs for each, but then you are doubling your development costs, which no business likes to do, so as long as they don't have a competitor eating their lunch, they are going to compromise and ship a chip that is half-assed on one, or the other, or both.
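The sweet-spot effect falls out of the basic power model: dynamic power scales roughly with f·V², static leakage is roughly constant, and voltage has to rise with frequency past a point. A toy sketch of how that produces an interior perf/watt peak (all constants here are invented for illustration, not measurements of any real chip):

```python
# Toy perf/watt curve: perf ~ f, power ~ static + k * f * V(f)^2,
# with voltage rising linearly with frequency. All constants are
# invented for illustration only.
def voltage(f_ghz):
    return 0.8 + 0.08 * f_ghz      # assumed V/f curve

def power(f_ghz, static_w=5.0, k=10.0):
    return static_w + k * f_ghz * voltage(f_ghz) ** 2

freqs = [f / 2 for f in range(2, 13)]          # 1.0 .. 6.0 GHz
eff = [(f, f / power(f)) for f in freqs]
best_f, best_eff = max(eff, key=lambda p: p[1])
print(f"perf/watt peaks around {best_f} GHz")  # interior sweet spot,
                                               # not at either extreme
```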
 
Eh, my fx6300 worked just fine and was only $90. Lasted me all the way from my Q6600 to my Ryzen 1600.
Intel parts of the time also worked just fine until Ryzen was released, the difference is those CPUs were at least worth the sand used to produce them.
 
...why on earth should they? There isn't a single thing that Intel is good at right now that isn't achieved by brute force TDP breaking.

AMD has the better architecture, the better manufacturing, the better scalability, the better thermal performance... by far.

And it's true on desktop, mobile and datacenter.

Very true.

If you're good at something... never do it for free. Intel is behind in every market... why should AMD just accept being the value product forever?
 
Honestly, I'll be sticking with my 5800x3d it seems. Thought this would prove to show better gains, but it really doesn't.

Think the advantage on this specific part is clear. You get the gaming goodness without having to sacrifice production workloads. This specific chip seems to be the best of both. Reviews of the 7800X3D will be more interesting. I'm not expecting massive improvements over the 5800X3D though... I mean the star here is the cache, not the arch. For gaming specifically, the archs aren't making a big difference. The 7800X3D will probably score only slightly higher than or equal to this part in gaming... but it won't be any different than a standard 7800 in production work.

The 7950X3D is the mullet of CPUs. Business in the front, party in the back.
 
It's pretty typical that a given combination of architecture and fab process results in a chip that has a voltage or power sweet spot where it gets the best perf/watt, and as you move outside of that sweet spot either on the low side or on the high side, the perf/watt suffers.

This is why I never liked it when they use the same design for both mobile and desktop parts. For mobile, power is a top priority, so you'd really want an architecture that targets that envelope in a way that allows mobile chips to maximize perf/watt in the low power range. For the desktop - however - while we don't completely ignore power, it is a much smaller concern, and as such we could benefit from an arch that maximizes performance in high-power situations.

In the 90's mobile CPU's were just scaled down desktop CPU's, but at some point this changed, and mobile became the priority, with chips primarily designed for low power optimization, and then attempted to be scaled up for the desktop, resulting in many compromises.

I argue we should really have separate designs for each, but then you are doubling your development costs, which no business likes to do, so as long as they don't have a competitor eating their lunch, they are going to compromise and ship a chip that is half-assed on one, or the other, or both.
What's most interesting to me is how the AMD CPU when scaled to a certain W is somehow always hotter than the intel CPU at the same W. I wonder what's going on there.
 
What's most interesting to me is how the AMD CPU when scaled to a certain W is somehow always hotter than the intel CPU at the same W. I wonder what's going on there.
Don't we already know that the Zen4 chips have an extra thick IHS to maintain compatibility with AM4 coolers? Delidding, or even shaving off a mm or so, takes a huge chunk of temp off.
 
IDK what you mean. AMD blows them away in efficiency, so not sure what you mean by what he is drinking lmao.
He has no idea what he's talking about. And the closest thing he can come up with as a retort is to reference some power scaling numbers from the 45W-higher-TDP 7950X.

AMD could choose to juice the 7950X3D with another 100w of maximum headroom, take away every marginal win Intel currently holds by doing so, and still come in with a lower total power draw.
 
Don't we already know that the Zen4 chips have an extra thick IHS to maintain compatibility with AM4 coolers?
We don't, and I doubt that it is true, given that literally no one complained about the requirement to spend a few bucks on a new mounting bracket for their existing coolers back when AM4 was introduced. Apart from that, I imagine that increasing socket thickness would be the more obvious solution to that problem.

I find it far more likely that the extra vertical space's primary reason is the cache in the 3D versions of the CPU. If I remember correctly, the previous-gen 5800X3D required quite a bit of effort to obtain thinner chiplets so that they could add the cache on top within the existing AM4 spec.
 
Just skimming the TPU review.

[Image: borderlands-3-2560-1440.png (Borderlands 3, 2560x1440 benchmark chart)]
BL 3 is very CPU intensive for some reason, but to see it get almost double the 5800X is impressive.
 