AMD To Unveil Next Major Products At Gamescom 2023, Radeon RX 7800 XT & 7700 XT Expected

He isn't wrong, though. AMD is decent, just not as good overall. The problem is they are seen as a slightly inferior product with a slightly lower price. Their software/features are generally a generation behind, and their product releases generally lag as well. The 4070/4060 Ti have been out for a few months now; AMD will be what, 5-6 months late to the party? Even for people who wanted to give AMD a chance, Nvidia was the only game in town, with the exception of last-generation products. 5-6 months is a long time in the tech world, so even if people were unhappy with Nvidia's $400-600 offerings, a lot of gamers already bought them.

If the products were on time and maybe priced a bit lower than they currently are, we'd likely see some shift.

DLSS Frame Gen in Ratchet & Clank was fairly impressive for me. I didn't notice many latency issues, and for slower-paced single-player games it is likely fine. That is an example of something AMD will need to counter.
Again, the numbers speak for themselves. These are excuses. They aren’t good enough for anyone to actually buy.

That’s the bottom line.
 
I don't care much about Navi32, but I'm hoping for an actual release date for FSR3. I'd be surprised if they didn't launch FSR3 next week, if only because they're really gonna need frame-gen tech to make the 7800XT look any good vs the 6800XT, given what we know about its leaked/rumored specs...
Unless AMD drops the open-source approach, the 6800XT will get the same "magic", ya?
 
Unless AMD drops the open-source approach, the 6800XT will get the same "magic", ya?
Open source means it's software-agnostic, but idk if it necessarily means it's hardware-agnostic. It would certainly be nice if it is (and I hope that's the case, both for the sake of gamers and for the FSR3 adoption rate, which would further help gamers), but my line of thought is that maybe FSR3 Fluid Frames will utilize the "AI" cores that got added in RDNA3 to speed things up enough to make the frame gen actually perform and look good enough to be worth using.

We'll hopefully find out next week.
 
Price, performance, features: does it do what you want, satisfactorily? In my case there is absolutely nothing below the 4090 that is worth it; joke products. I hope AMD does not follow suit below the 7900 series, which are pretty good in my experience. Looks like the 7700 and 7800 XT have a very key feature, and that is sufficient VRAM. A matter of price now.

I don’t see AMD selling products that compete well for a significantly cheaper price, making them the cheap brand. If they ever do smash Nvidia in performance, I expect them to then really try to get market share.
 
Open source means it's software-agnostic, but idk if it necessarily means it's hardware-agnostic. It would certainly be nice if it is (and I hope that's the case, both for the sake of gamers and for the FSR3 adoption rate, which would further help gamers), but my line of thought is that maybe FSR3 Fluid Frames will utilize the "AI" cores that got added in RDNA3 to speed things up enough to make the frame gen actually perform and look good enough to be worth using.

We'll hopefully find out next week.
Dynamic resolution; using mouse, keyboard, and controller inputs to drive the frame generation for motion, as is done in VR. Maybe a minimum-frame-rate switch option that turns on FSR 3 frame generation, or working with Chill for utterly consistent frame rates and frame times. There is so much AMD could do to make it more useful. The only thing I can see using it for would be Flight Simulator in VR, if the generated frames are good enough, that is.
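A toy sketch of that minimum-frame-rate switch idea, with hysteresis so the toggle doesn't flicker on and off (everything here is hypothetical; no driver exposes exactly this control today):

```python
# Hypothetical "minimum frame rate switch": enable frame generation only
# when the real frame rate drops below a floor, and disable it again
# once the native rate comfortably recovers.

class FrameGenSwitch:
    def __init__(self, floor_fps: float = 60.0, resume_fps: float = 72.0):
        self.floor_fps = floor_fps    # turn frame gen ON below this
        self.resume_fps = resume_fps  # turn it back OFF above this
        self.enabled = False

    def update(self, measured_fps: float) -> bool:
        """Feed in the measured real FPS once per second or so."""
        if not self.enabled and measured_fps < self.floor_fps:
            self.enabled = True    # can't hold the floor natively: generate
        elif self.enabled and measured_fps > self.resume_fps:
            self.enabled = False   # native rate recovered: stop generating
        return self.enabled

switch = FrameGenSwitch()
for fps in (90, 75, 58, 62, 70, 80):
    print(fps, "fps ->", "frame gen ON" if switch.update(fps) else "frame gen OFF")
```

Chill already targets a min/max FPS band in a similar way, so this is the same shape of control loop AMD would need.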
 
Longest list of excuses ever. You’re worse than an F1 driver.
Try to point out what you don't agree with.
Maybe (I am not sure about either; Baldur's Gate 3 is a really big hit, DLAA can be more stable than TAA in motion, and the library of titles where you have room to spare even at the monitor's native resolution will just keep growing over time), but media covering that space did talk as if latency was a big deal for frame generation, and Reflex keeping it lower than otherwise was important in that regard.
Reflex is just trying to correct a problem created by DLSS/DLAA. Nvidia's Reflex and AMD's Anti-Lag are placebos as far as I'm concerned. The best way to reduce input lag is just to limit your frame rate, and of course don't use upscalers like DLSS or FSR.
In some cases DLSS3 had even less latency than DLSS2 without Reflex on; all of a sudden Reflex became useful outside the competitive gaming crowd, even for people who do not know it exists or that it is on.
Part of that is due to not running the game at true 4K. That, and DLSS3 creates fake frames. Not long ago people had a problem with using AI to increase the frame rate of movies, but somehow DLSS3 doing it is totally OK. I'm not entirely sure I'm OK with the idea of creating fake frames to give me the illusion of real frames, since I haven't seen it in action.

https://youtu.be/_KRb_qV9P4g
 
Dynamic resolution; using mouse, keyboard, and controller inputs to drive the frame generation for motion, as is done in VR
Do you need a render resolution much bigger than the screen? To take the example above, the only way to know in advance that there would be a window to the right is if the render was much bigger than the viewport window:



For something like VR, where not making people sick is an obligation, it is certainly worth rendering larger to get away with fewer 'real' FPS; and how much larger you need to render for natural head-movement speed versus mouse-movement speed could be different.

And maybe VR naturally does this, with each individual eye viewport being larger than the combined displayed one, giving them a lot of cutting-room-floor footage to use.

Do we have some idea of how much larger a render you would need to support fast mouse movement?
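A rough back-of-envelope with made-up numbers (90° horizontal FOV, 2560 px wide, 60 real fps, and a flat pixels-per-degree approximation that understates the margin at the edges of a perspective image):

```python
# Back-of-envelope: extra render width needed so an extrapolated frame
# can pan sideways without running out of pixels. All figures here are
# illustrative assumptions, not measurements.

FOV_DEG = 90.0      # horizontal field of view
WIDTH_PX = 2560     # displayed horizontal resolution
REAL_FPS = 60.0     # rate of fully rendered frames

PX_PER_DEG = WIDTH_PX / FOV_DEG  # ~28.4 px/deg, uniform approximation

for turn_rate in (180.0, 360.0, 1000.0):       # slow turn .. fast flick, deg/s
    deg_per_frame = turn_rate / REAL_FPS       # rotation between real frames
    margin_px = deg_per_frame * PX_PER_DEG     # margin needed on EACH side
    overscan = (WIDTH_PX + 2 * margin_px) / WIDTH_PX
    print(f"{turn_rate:6.0f} deg/s -> {margin_px:4.0f} px per side "
          f"({overscan:.0%} of native width)")
```

So a slow 180°/s turn only needs ~85 px per side (about 7% more width), but a fast 1000°/s flick needs ~475 px per side, roughly 37% more width, and true perspective projection would need even more toward the edges.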
 
Reflex is just trying to correct a problem created by DLSS/DLAA. Nvidia's Reflex and AMD's Anti-Lag are placebos as far as I'm concerned. The best way to reduce input lag is just to limit your frame rate, and of course don't use upscalers like DLSS or FSR.
Not sure I follow you at all here; DLSS or FSR reduce input lag a lot versus native (like most things that increase frame rate). Not sure what you mean by placebo; in fast-paced multiplayer shooters, Reflex can reduce latency by 33% and more:

[chart: Overwatch 2 training-range latency, 3440x1440, DX11 Ultra]


Part of that is due to not running the game at true 4K.
In both compared cases, DLSS2 and DLSS3 are not running the game at true 4K, so it has nothing to do with that; I did not say less latency than native without Reflex on.
 
Try to point out what you don't agree with.
AMD is a fundamentally broken business.
No one wants an AMD card regardless of price. They could give 7900XTX cards away for free and people would still refuse them and would rather pay $1600 for 4060 Tis.

Spending big bricks of text to make statements to the contrary doesn’t change their undesirability.
 
AMD is a fundamentally broken business.
No one wants an AMD card regardless of price. They could give 7900XTX cards away for free and people would still refuse them and would rather pay $1600 for 4060 Tis.
This feels like trolling at this point.

https://www.newegg.com/d/Best-Sellers/GPUs-Video-Graphics-Cards/s/ID-48
https://www.newegg.ca/d/Best-Sellers/GPUs-Video-Graphics-Cards/s/ID-48

The best-selling GPU on newegg.ca is an AMD GPU, as are 4 of the top 8 SKUs (5 of the top 8 in the USA), with a 6800 XT being cheaper than a 4070 being a big reason why.

Virtually everyone has a price at which they want an AMD card, and you obviously know that, so this hyperbole is just an expressive way of saying: a lot of people are ready to pay a bit extra for the green box, including OEMs, companies, and builders? Sure...

Look on amazon best sellers:
https://www.amazon.com/Best-Sellers-Computer-Graphics-Cards/zgbs/pc/284822
 
This feels like trolling at this point.

https://www.newegg.com/d/Best-Sellers/GPUs-Video-Graphics-Cards/s/ID-48
https://www.newegg.ca/d/Best-Sellers/GPUs-Video-Graphics-Cards/s/ID-48

The best-selling GPU on newegg.ca is an AMD GPU, as are 4 of the top 8 SKUs (5 of the top 8 in the USA), with a 6800 XT being cheaper than a 4070 being a big reason why.

Virtually everyone has a price at which they want an AMD card, and you obviously know that, so this hyperbole is just an expressive way of saying: a lot of people are ready to pay a bit extra for the green box, including OEMs, companies, and builders? Sure...

Look on amazon best sellers:
https://www.amazon.com/Best-Sellers-Computer-Graphics-Cards/zgbs/pc/284822
This is just a single data point. We already know >80% of the business is Nvidia’s. That isn’t in question.

Cost is absolutely not a determining factor. We’ve been over that in at least 3 other threads already.
 
This is just a single data point. We already know >80% of the business is Nvidia’s. That isn’t in question.
And not over 90%, because a lot of people buy AMDs.

Cost is absolutely not a determining factor. We’ve been over that in at least 3 other threads already.
That cannot be true. Why would AMD sell their GPUs at a lower price than Nvidia if price is not a factor, and why do we see prices go down over time after launch? You think AIBs, AMD, and resellers are that stupid, losing money on every sale, instead of selling the 6800 XT at the same price as the 3080, the 7600 at the price of a 4060, and the 7900 XTX at the price of the 4080, if they would not lose a single sale and would just make more money?

Again, this cannot be something you really mean.
 
And not over 90%, because a lot of people buy AMDs.
<20% is not “a lot of people” comparatively. If AMD didn’t count integrated, it would be far lower than that.
That cannot be true. Why would AMD sell their GPUs at a lower price than Nvidia if price is not a factor, and why do we see prices go down over time after launch? You think AIBs, AMD, and resellers are that stupid, losing money on every sale, instead of selling the 6800 XT at the same price as the 3080, the 7600 at the price of a 4060, and the 7900 XTX at the price of the 4080, if they would not lose a single sale and would just make more money?
People buy nVidia for their features (better RT, DLSS, Reflex, CUDA, etc.), better performance per watt, better driver stack, etc.

No AMD card can compete in the space. If I need CUDA, I need CUDA. A free 7900XTX then literally does not fulfill my needs as an end user. And that thinking process continues over all the rest of the feature stack.

AMD is a cut-rate bargain brand. Nobody buys them because they want to. Everyone wants Nvidia's features, whether they are willing to pay for them or not. And clearly, with their market dominance, most people would rather have a lower-performing nVidia card for more than a greater-performing AMD card for less. (Let's get those 1050 Tis, boys!)

Again, this cannot be something you really mean.
I have nothing to do with this. The market has spoken.
 
<20% is not “a lot of people” comparatively. If AMD didn’t count integrated, it would be far lower than that.
Those high Nvidia market-share figures do not count integrated; the numbers we see are usually dGPU-only. When you look at GPU sales overall, Intel tends to sell the most and AMD sometimes ties with Nvidia.

Last year it was Intel at 60-70%, Nvidia 17-20%, and AMD 12-19%, if we are talking PC GPUs in general.
https://www.statista.com/statistics/754557/worldwide-gpu-shipments-market-share-by-vendor/

I have nothing to do with this. The market has spoken.
You think AMD sells the exact same number of 7600s if they sell at $300 instead of $270? You know that for a fact, but they do not, and they throw away money on every sale they make? Same for all their models: they could all be significantly more expensive right now and they would not lose a single sale? Is that what you really mean?
 
Those high Nvidia market-share figures do not count integrated; the numbers we see are usually dGPU-only. When you look at GPU sales overall, Intel tends to sell the most and AMD sometimes ties with Nvidia.

Last year it was Intel at 60-70%, Nvidia 17-20%, and AMD 12-19%, if we are talking PC GPUs in general.
https://www.statista.com/statistics/754557/worldwide-gpu-shipments-market-share-by-vendor/
Okay. It doesn't change the fact that when given a choice, people are purchasing nVidia GPUs with overwhelming preference.

But I think you also raise an excellent point here. People would also rather buy Intel GPUs over AMD ones. Anything but AMD.
You think AMD sells the exact same number of 7600s if they sell at $300 instead of $270? You know that for a fact, but they do not, and they throw away money on every sale they make? Same for all their models: they could all be significantly more expensive right now and they would not lose a single sale? Is that what you really mean?
No, the pricing of AMD GPUs doesn't matter, because people will simply buy an nVidia GPU at at least a 5:1 ratio regardless of AMD's pricing strategy, including and up to "selling" GPUs for zero dollars. Again, because a free GPU is insufficient if the end user requires any form of nVidia tech, such as CUDA.

It is not sufficient that AMD creates or has a competing tech. It has to literally be the exact same tech in order for them to compete. And licensing all of nVidia's tech would simply make them less competitive in terms of cost. There is no winning strategy for AMD here.
 
but you'd still rather have an nVidia card if price was no object
No. Fake frames hold no interest for me, and ray-tracing only a little. The raster performance is plenty good enough for me.

And I know my card isn't going to make a big difference, but if AMD bows out, nVidia will jack up the price of their cards to make the mining-era prices look mild. "Never buy anything but nVidia" types will deserve that, but I don't think I do.
 
No. Fake frames hold no interest for me, and ray-tracing only a little. The raster performance is plenty good enough for me.

And I know my card isn't going to make a big difference, but if AMD bows out, nVidia will jack up the price of their cards to make the mining-era prices look mild. "Never buy anything but nVidia" types will deserve that, but I don't think I do.
I don’t think you understand what “price no object” means. Literally you can have anything you want, and you’d choose an inferior item. That either means you’re illogical, you don’t understand the question at hand, or….

There are zero people, price no object, who wouldn’t have a 4090. There is literally no one on this planet who wants a gaming card who would want anything else. Someone might try to argue power, size, etc., but in this scenario none of those things go by price. And nVidia wins there anyway.
 
AMD is a fundamentally broken business.
No one wants an AMD card regardless of price. They could give 7900XTX cards away for free and people would still refuse them and would rather pay $1600 for 4060 Tis.

Spending big bricks of text to make statements to the contrary doesn’t change their undesirability.
says you.
 
Not sure I follow you at all here; DLSS or FSR reduce input lag a lot versus native (like most things that increase frame rate). Not sure what you mean by placebo; in fast-paced multiplayer shooters, Reflex can reduce latency by 33% and more:

[chart: Overwatch 2 training-range latency, 3440x1440, DX11 Ultra]


In both compared cases, DLSS2 and DLSS3 are not running the game at true 4K, so it has nothing to do with that; I did not say less latency than native without Reflex on.
It depends on how bad the game's input lag already is. If the game already has an input lag of 30 ms or higher, then DLSS and FSR can improve it. If the game has 10 ms, then DLSS and FSR can make it worse. Higher frame rates do generally decrease input lag, and a lot of modern games really stress modern GPUs, so of course a lot of them have an input lag of 30 ms or higher. Interestingly, it seems Nvidia Reflex does what I suggested, and that's limit frame rate.
https://blog.ishitatsuy.uk/post/latencyflex/
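To put rough numbers on the frame-cap point, here's a toy model with made-up figures (a GPU-bound game at ~100 fps with a render queue up to 3 frames deep); it only counts render-queue latency, not display or OS overhead:

```python
# Toy model of why capping FPS just below the GPU limit cuts input lag.
# All numbers are illustrative assumptions.

GPU_FRAME_MS = 10.0   # GPU needs 10 ms per frame -> max ~100 fps
QUEUE_DEPTH = 3       # frames that can pile up between CPU and display

# Uncapped and GPU-bound: the CPU runs ahead until the queue is full,
# so a freshly sampled input waits behind a full queue of older frames.
uncapped_ms = QUEUE_DEPTH * GPU_FRAME_MS

# Capped slightly below the GPU limit: the CPU never outruns the GPU,
# the queue stays empty, and input waits only ~one frame interval.
CAP_FPS = 97.0
capped_ms = 1000.0 / CAP_FPS

print(f"uncapped, queue full: ~{uncapped_ms:.0f} ms of render-queue latency")
print(f"capped at {CAP_FPS:.0f} fps:    ~{capped_ms:.1f} ms")
```

Same GPU, almost the same frame rate, roughly a third of the queue latency; that is the effect a well-placed manual frame cap gets you by hand.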
 
It depends on how bad the game's input lag already is. If the game already has an input lag of 30 ms or higher, then DLSS and FSR can improve it. If the game has 10 ms, then DLSS and FSR can make it worse. Higher frame rates do generally decrease input lag, and a lot of modern games really stress modern GPUs, so of course a lot of them have an input lag of 30 ms or higher. Interestingly, it seems Nvidia Reflex does what I suggested, and that's limit frame rate.
https://blog.ishitatsuy.uk/post/latencyflex/
Not quite a limit; Reflex requires that frame markers get coded in, and it works more like a specialized thread director.
If it limited frame rates, you would see that clearly in benchmarks.
Reflex monitors the game's main thread, as well as the simulation, render, and presentation threads. It then works to remove bottlenecks, adding markers to the commands so it can process things out of order and just rearrange them for final presentation.
Reflex works to reduce situations where you have a pinned CPU core that is messing up the bunch.
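For what it's worth, here is a conceptual sketch of that marker-based loop from the game's side. The function names are made up for illustration (they are NOT the actual Reflex/NVAPI entry points), but the shape matches the description above: mark where each stage begins and ends, and let the driver pace the start of the CPU frame so the GPU never builds a backlog:

```python
# Hypothetical marker-instrumented game loop. Names are stand-ins, not
# the real Reflex SDK API; the point is the structure, not the calls.

import time

def set_marker(name: str) -> None:
    """Stand-in for a driver call that timestamps a pipeline stage."""
    print(f"{time.perf_counter():.6f}  {name}")

def low_latency_sleep() -> None:
    """Stand-in for the driver-paced wait before the CPU frame starts.
    The real delay would be computed by the driver from the markers, so
    input is sampled as late as possible while still making the frame."""
    time.sleep(0.002)  # placeholder value

def run_frame(sample_input, simulate, submit_render, present) -> None:
    low_latency_sleep()               # delay the frame START, not its end
    set_marker("SIMULATION_START")
    state = simulate(sample_input())  # input read right before simulation
    set_marker("SIMULATION_END")
    set_marker("RENDERSUBMIT_START")
    submit_render(state)
    set_marker("RENDERSUBMIT_END")
    set_marker("PRESENT_START")
    present()
    set_marker("PRESENT_END")

# Demo with no-op stages:
run_frame(lambda: None, lambda i: i, lambda s: None, lambda: None)
```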
 
Nvidia Reflex does what I suggested and that's limit frame rate.
It is a bit of a dynamic frame limiter (finding the highest value you can go to while keeping that no-wait-on-the-GPU effect), yes, but a bit better than that, since it has intelligence in the different threads of the game engine. The impact on FPS can be close to nil in some scenarios, and it can even beat a lower locked frame rate; you can see it in the video provided in your own link:


https://youtu.be/QzmoLJwS6eQ?t=463
 
I have been an Nvidia user since the TNT2 days. I switched to AMD this time, price being the main deciding factor.

I have not noticed any problems with the AMD drivers to date. Most of the causes for black screens, CTDs, etc. with AMD cards have to do with an unstable system OC.

I have noticed one area where the AMD card does a better job, and that is VSR vs. DSR. With NV I had quite a number of games where the UI would not scale properly, making DSR unusable. With AMD's VSR I have not found one game where the UI does not scale properly. I am personally looking forward to seeing what AMD announces next.

AMD does have some work to do with their feature set. However, the one thing that I do appreciate from the AMD side is that everything is open source (for now).

The only thing that I am truly bummed about is how badly the OC functions of the card are locked down, but that will be a non-issue here soon, as I have an Elmore Labs EVC2SE on order.

I am looking forward to seeing what they announce in the near future, as I have a few systems that need upgrading. I think both AMD and NV have work to do; as with all things in the business world, though, competition is good for the consumer.
 
AMD is a fundamentally broken business.
No one wants an AMD card regardless of price. They could give 7900XTX cards away for free and people would still refuse them and would rather pay $1600 for 4060 Tis.

Spending big bricks of text to make statements to the contrary doesn’t change their undesirability.

No. But Nvidia can sell less for more in terms of pure performance; their software makes up for it. But if a notable price difference occurs, we'll see a gradual shift in market share. It won't happen overnight. Example: if AMD had something that performed as fast as an RTX 4070 with 16GB of VRAM and was priced at $400, they would sell a lot of those. DLSS is nice, but if you can spend 50% less money and get something that performs 25% faster than Nvidia's offering at the same price point, the DLSS part of the equation becomes a moot point.

AMD would also need to bring these cards out in a timely manner, before half the customers buy Nvidia. AMD's current strategy is to price at or just below Nvidia while coming out later. Just coming out later puts them at a disadvantage. AMD needs to be first, or very quick, and offer something very compelling price-wise to make a dent in Nvidia.

I assume the problem is that with Nvidia's current hold on the market, they can weather price drops better than AMD. If AMD triggers a price war, Nvidia can likely match, even if they are unhappy. Let's say AMD's RTX 4070 performance peer with 16GB at $400 did come out. Nvidia would likely counter by dropping the 4070 to $450. And once again, most people will continue to buy the Nvidia offering due to drivers, software, and familiarity. AMD's sales likely wouldn't improve, but they would have thinner margins. And I assume Nvidia can better handle thinner margins than AMD due to volume and other non-gaming products/services.

Those are assumptions and I could be entirely wrong.
 
I assume the problem is that with Nvidia's current hold on the market, they can weather price drops better than AMD. If AMD triggers a price war, Nvidia can likely match, even if they are unhappy. Let's say AMD's RTX 4070 performance peer with 16GB at $400 did come out. Nvidia would likely counter by dropping the 4070 to $450. And once again, most people will continue to buy the Nvidia offering due to drivers, software, and familiarity. AMD's sales likely wouldn't improve, but they would have thinner margins. And I assume Nvidia can better handle thinner margins than AMD due to volume and other non-gaming products/services.
Yes, I think this is the reason AMD is avoiding a price war with Nvidia.

However, they could get better mindshare if they released their budget cards at least 6 months before Nvidia.

Maybe once the RDNA 2 stock is sold out, they should prioritize getting the budget RDNA 4 cards out by July/August next year (Nvidia won't have anything in response for at least a year).
 
Again, the numbers speak for themselves. These are excuses. They aren’t good enough for anyone to actually buy.

That’s the bottom line.
I agree. At least until AMD drops their prices. The 7xxx series costs way too much for how it performs.

However, I got a 6700 xt this past November for a good price. Quite happy with that purchase.
 
I don’t think you understand what “price no object” means.
That's OK, I don't think you're right. You clearly don't want to admit the concept of "I wouldn't buy that brand no matter what" exists, even though you are constantly saying nobody wants AMD cards.

While I admit I am not one of the "wouldn't buy an nVidia card at any price" people, I don't want one. I don't need the performance in the games I play. I don't care enough about ray tracing or fake frames. A 4090 at any price would be wasted on me. I think I know my needs and desires better than you do, so perhaps stop pretending that's not the case.
 
No. But Nvidia can sell less for more in terms of pure performance; their software makes up for it. But if a notable price difference occurs, we'll see a gradual shift in market share. It won't happen overnight. Example: if AMD had something that performed as fast as an RTX 4070 with 16GB of VRAM and was priced at $400, they would sell a lot of those. DLSS is nice, but if you can spend 50% less money and get something that performs 25% faster than Nvidia's offering at the same price point, the DLSS part of the equation becomes a moot point.

AMD would also need to bring these cards out in a timely manner, before half the customers buy Nvidia. AMD's current strategy is to price at or just below Nvidia while coming out later. Just coming out later puts them at a disadvantage. AMD needs to be first, or very quick, and offer something very compelling price-wise to make a dent in Nvidia.

I assume the problem is that with Nvidia's current hold on the market, they can weather price drops better than AMD. If AMD triggers a price war, Nvidia can likely match, even if they are unhappy. Let's say AMD's RTX 4070 performance peer with 16GB at $400 did come out. Nvidia would likely counter by dropping the 4070 to $450. And once again, most people will continue to buy the Nvidia offering due to drivers, software, and familiarity. AMD's sales likely wouldn't improve, but they would have thinner margins. And I assume Nvidia can better handle thinner margins than AMD due to volume and other non-gaming products/services.

Those are assumptions and I could be entirely wrong.
AMD has been the budget brand for 20+ years and there hasn’t been movement there even with any differences in price.

That's OK, I don't think you're right. You clearly don't want to admit the concept of "I wouldn't buy that brand no matter what" exists, even though you are constantly saying nobody wants AMD cards.

While I admit I am not one of the "wouldn't buy an nVidia card at any price" people, I don't want one. I don't need the performance in the games I play. I don't care enough about ray tracing or fake frames. A 4090 at any price would be wasted on me. I think I know my needs and desires better than you do, so perhaps stop pretending that's not the case.
Okay. There is literally zero logic behind not wanting 40% more raster performance than the 7900xtx. “Wasted” or not, in this scenario it’s free.

Your rationale is based on your personal feelings, not on whether one of the options is obviously superior. It objectively is.

And humans not acting in their own best interest isn’t exactly something new.
 
“Wasted” or not, in this scenario it’s free.
Well, other than an extra 200W of heat. In coastal Texas. In the middle of a heatwave. Using your argument, I'd be stupid not to take a Bugatti for tooling around town, just because it's free.
 
AMD would also need to bring these cards out in a timely manner, before half the customers buy Nvidia. AMD's current strategy is to price at or just below Nvidia while coming out later. Just coming out later puts them at a disadvantage. AMD needs to be first, or very quick, and offer something very compelling price-wise to make a dent in Nvidia.
AMD isn't releasing new graphics cards because they have an overstock of older cards. It made sense for them to lower the prices of their RDNA2 cards. They are selling well, but the problem is the RTX 30 series sold even better; good enough for Nvidia to produce the RTX 40 series.
I assume the problem is that with Nvidia's current hold on the market, they can weather price drops better than AMD. If AMD triggers a price war, Nvidia can likely match, even if they are unhappy. Let's say AMD's RTX 4070 performance peer with 16GB at $400 did come out. Nvidia would likely counter by dropping the 4070 to $450. And once again, most people will continue to buy the Nvidia offering due to drivers, software, and familiarity. AMD's sales likely wouldn't improve, but they would have thinner margins. And I assume Nvidia can better handle thinner margins than AMD due to volume and other non-gaming products/services.
AMD could weather a price war better, because again, AMD doesn't just make GPUs. The problem isn't that AMD couldn't weather a price war, but whether the shareholders would want such a thing. Try having AMD explain to shareholders that for the next 2-3 years they may make hardly any profit from selling GPUs, but are thinking long term, trying to capture market share so they can eventually raise prices and actually make big bucks.

The other factor we're also missing here is Intel, and I do think Intel plans to keep pricing GPUs so low that Intel won't make money off them. I think Nvidia knows this, and the rumor that Nvidia is trying to prevent Chinese GPU manufacturers from making Intel Battlemage GPUs is probably due to Nvidia knowing that Intel is ready to go broke for market share. Unlike Nvidia, AMD doesn't have the fanboys to stick with them.
 
Well, other than an extra 200W of heat. In coastal Texas. In the middle of a heatwave. Using your argument, I'd be stupid not to take a Bugatti for tooling around town, just because it's free.
You could easily downclock the 4090, get all of the software features, and still come out ahead on every conceivable metric other than the physical size of the card. However, if you want to play that game, then nVidia has a better card for performance per watt at every level. Or water cooling.

There isn’t a scenario there that AMD comes out ahead on when cost isn’t a factor. Not one.

And obviously your analogy doesn't hold up here, as there can't be a discussion about changing the efficiency of automobiles; at least not in a way that can be done with a software slider.
 
AMD could weather a price war better, because again, AMD doesn't just make GPUs. The problem isn't that AMD couldn't weather a price war,
Both statements sound fishy to me. It would need to include AMD winning a price war on the datacenter side of things, and Nvidia is probably by now a larger company outside of consumer PC discrete GPUs than AMD is as a whole.

AMD just had 2 quarters in a row with negative operating income; Nvidia made $3.4 billion in operating income over the last 2 quarters.

AMD won market share during the late-2020-to-2021 GPU demand craze, with people buying whatever was available, and it went right back down once builders and customers had the ability to choose again.

Well, other than an extra 200W

The 4090 is among the lowest-wattage options you can have; if you lock the frame rate, you should use 20-30% less power than a 6800. Heat is a good reason to get a Lovelace card.
 
AMD has their stuff priced exactly where it needs to be, just not where we want it to be.
AMD generally sells pretty much all the silicon they produce, and very rarely has leftover stock gathering dust on shelves.
The RX 6000 series is the glaring exception to that rule, and AMD wants to make it go away. You have to understand that the overstock on the 6000 series was so huge that the already-printed silicon was almost equal to their entire expected sales for the following year; a full 100% overproduction.
Now, that may be the fault of the AIBs, who were ordering that silicon as fast as they could to take advantage of the crypto boom, and it's their problem to deal with. But AMD is not so large that they can tell the AIBs off; they are partners after all, so their mistakes are AMD's to aid in sorting out.
AMD's solution is to cut back on any product that directly competes against the abundant 6000 stock. AMD can easily transition that silicon to other products.
People give Nvidia flak for how they treat their AIBs, seeing them less like partners and more like a necessary annoyance, and it's for these very reasons. AIBs make it difficult to control inventory when they order more than they can sell, and Nvidia is very careful about controlling that inventory.
 
What is it with GPU threads always devolving?

It's incontestable that NV has the best card out there with the 4090, but to argue that pricing is completely irrelevant seems absurd.

Not everyone can just shrug and buy a 4090, many of us have to budget for a GPU. AMD can't compete with Nvidia's mindshare they get from consistently holding the high-end, but there's plenty of people who ran the numbers, looked at the charts, and said "yeah, I'll buy a midrange AMD GPU because it's what I can afford and it's a decent price/performance ratio".

Being the absolute best isn't everything, and AMD isn't trying to simply be the best outright desktop GPU manufacturer; they're just trying to make some GPUs to complement the many, many other segments of their business.

If the 7800XT has a decent price/perf ratio, it'll sell even if it "sucks" in some ways compared to NV.
 