AMD is Allegedly Preparing Navi 31 GPU with Dual 80 CU Chiplet Design

Now that sounds like an interesting and potentially quite fun GPU to play with, if they can get the scaling working on-card rather than relying on software.
Sounds nice so far, but it'll be a long wait...
They've probably got to stash enough full Navi chips first.
 
I love these rumors. They can’t even supply their released high tier, haven’t even released the rest of the lineup... but hey look what we might be getting a year later!

How about we get leaks about what matters in the next 6 months...
 
I love these rumors. They can’t even supply their released high tier, haven’t even released the rest of the lineup... but hey look what we might be getting a year later!

How about we get leaks about what matters in the next 6 months...

What does supply have to do with r&d?
 
What does supply have to do with r&d?
[attached meme image]


The article hints at using two existing big-navi GPUs.
 
The article hints at using two existing big-navi GPUs.
Not sure how big the distinction is, if there is one, but it is using two 80 CU dies as chiplets in a single-GPU design, not two GPUs on the same card, which was the old plan:
https://wccftech.com/amd-navi-gpus-not-using-mcm-feature-monolithic-die-radeon-rx-gaming-cards/

As for the idea that people circulating rumors should only talk about the next few months and never about longer-road-ahead changes, even a potentially massive one like GPUs adopting something that looks like the Ryzen chiplet design and blowing open the possibilities for scaling, yields, and SoCs: I get the exasperation with GPU news when we cannot buy anything right now anyway, but that's not fair to the click-money people.
 
I love these rumors. They can’t even supply their released high tier, haven’t even released the rest of the lineup... but hey look what we might be getting a year later!

How about we get leaks about what matters in the next 6 months...
Maybe AMD found another way to stimulate interest, stock prices, staying in the news, etc.: just feed tidbits, true or not, to the endless number of mouths that will broadcast it all over the internet, with others repeating it louder and louder. Cascade effect. Where were the leakers a year ago with RDNA2 and Infinity Cache? Some of us guessed that correctly way before these supposedly informed broadcasters even knew or considered it.
 
How about supplying a fucking real-life video card instead of this paper-launch shit with very few GPUs.
Well, if there is nothing to talk about, making stuff up seems to be the trend. The pace of major releases with huge performance gains on a yearly cadence is long gone, and all the sites that used to do great reviews really just don't have much to talk about now. Even monitors are rather stagnant, and that's an item people already keep for 5+ years. I really don't see any new architecture coming this year, just more RDNA2 and Ampere SKUs. As for Intel? Do they even count? Maybe late this year is my guess. And if one has a CPU from the last generation or even older, the new generation will not give much more in the end other than benchmark numbers.
 
That doesn't answer my question. So, does a company stop R&D because of supply issues, or keep on trucking?
because a single one of these chips is nearly impossible for AMD to supply, so why put the dollars into a project (including all the way up to production of engineering samples) that requires TWO of the chips? Even if AMD had the idea in a state of planning and simulation, the 7nm demand and supply issues would have been an obvious indicator that such a product would never be feasible, and it wouldn't have moved to the real-world testing phase.
 
because a single one of these chips is nearly impossible for AMD to supply,
They could plan on the supply issue being largely fixed by 2022/the next console upgrade, when that kind of chiplet GPU would come out.

And AMD is shipping many millions of big Navi chips via the PS5/Xbox, not necessarily that 80 CU part, but by then...
 
because a single one of these chips is nearly impossible for AMD to supply, so why put the dollars into a project (including all the way up to production of engineering samples) that requires TWO of the chips? Even if AMD had the idea in a state of planning and simulation, the 7nm demand and supply issues would have been an obvious indicator that such a product would never be feasible, and it wouldn't have moved to the real-world testing phase.

Lol, so you don't know how the real world works, yet you want to post memes at me. R&D goes on or there would never be progress.
 
Let me see if I recall AMD's financial call today: supply will be constrained till June and should then improve. Consoles are selling much better than expected, and Q1 2021 sales will be much higher than expected for PC CPUs and GPUs because of pent-up demand. New mobile designs are going to AIBs. Milan EPYC CPUs, I believe, are supposed to come out in Q2. AMD expects overall growth across all sectors to be 37% for this year.

Going to a chiplet design worked magic for AMD in CPUs: one design that scales from small parts to the largest in the industry. It would seem that AMD will repeat that type of design in GPUs if they can get it to work. It also allows a different, cheaper process for the I/O die, if there is one. Anyway, a two-chiplet design may have much better yields, with each die smaller overall thanks to the I/O die configuration, allowing more video cards to be made.
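
To put the yield point in concrete terms, here is a minimal sketch using the textbook Poisson yield model (yield = e^(-D0*A)); the die areas and the defect density are illustrative assumptions, not AMD's actual figures:

```python
import math

def poisson_yield(area_mm2: float, defects_per_mm2: float) -> float:
    """Fraction of dies with zero defects under a Poisson defect model."""
    return math.exp(-defects_per_mm2 * area_mm2)

D0 = 0.001          # assumed defect density (defects/mm^2), illustrative only
MONOLITHIC = 520.0  # assumed area of one big 80 CU die, mm^2
CHIPLET = 260.0     # assumed area of one half-size chiplet, mm^2

y_mono = poisson_yield(MONOLITHIC, D0)
y_chip = poisson_yield(CHIPLET, D0)

print(f"Monolithic die yield: {y_mono:.1%}")   # ~59%
print(f"Single chiplet yield: {y_chip:.1%}")   # ~77%
# Two known-good chiplets get paired per GPU, but defective dies are
# discarded individually, so usable silicon per wafer still improves.
print(f"Usable-silicon ratio (chiplet vs mono): {y_chip / y_mono:.2f}x")
```

Even with these made-up numbers, the half-size die yields roughly 77% versus 59%, which is the whole chiplet pitch in one line.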
 
That doesn't answer my question. So, does a company stop R&D because of supply issues, or keep on trucking?
They would have to manage around supply constraints for any new design. The answer is: of course not.
 
Lol, so you don't know how the real world works, yet you want to post memes at me. R&D goes on or there would never be progress.

You may have me confused with someone else; I'm not posting memes in an aggressive way, but sorry if you took offence. R&D is quite important, in fact, it's probably priority #1 at a tech company. However, there are several steps to the R&D process, and creating a physical, functioning sample is VERY late in the process. To create a functioning product, there need to be multiple teams developing and fabricating PCB layouts, writing functioning firmware and drivers, etc. This costs the company a LOT of money and time, and as I said, this is a late-term step in an already expensive process. It's not just 'hey, let's test this idea, slap something together!'. There would be design-goals meetings, market research, product design sprints, and EXTENSIVE simulation and theoretical performance projections before anyone even touches a physical circuit board. And at some step in that process, someone at AMD would realise that they are not capable of supplying this product to market profitably, in any significant quantity, in any time frame that makes sense. By the time the supply issues are resolved, the theoretical advantage of this product would be overshadowed by products fabricated on newer process nodes. The 5nm process is already ramping up, with multiple companies shipping 5nm ICs to consumers. I highly doubt AMD would pump expensive R&D into this product.


In other words I don't think this leak is legit.
 
Don't confuse supplying with buying. Did you see the inventory numbers? Way up. And as you must know, that inventory is not sitting around for long.

I guess I'm more referring to supplying to meet demand. They are obviously fabbing and selling the product, but if they could increase their output by 300%, they'd still fly off the shelves.

So, to me, it makes no sense to create a product that uses two dies when they are already having tons of trouble maintaining supply of the product that only uses one.
 
If the RX 7000 series (or whatever it ends up being called) isn't launching till 2022, it's probably going to be on 5nm, which means an 80 CU Navi 31 chip could be significantly smaller than Navi 21 and thus easier to supply in volume.
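
A quick back-of-envelope on that, starting from Navi 21's roughly 520 mm² on 7nm; the density gain used here is an assumed round number, so treat the result as a loose estimate:

```python
navi21_area_mm2 = 520    # Navi 21 on 7nm is roughly 520 mm^2
density_gain_5nm = 1.8   # assumed 7nm -> 5nm logic-density gain, illustrative

# Crude estimate: SRAM and analog shrink far less than logic,
# so a real die would land somewhere above this number.
navi31_est = navi21_area_mm2 / density_gain_5nm
print(f"Estimated 80 CU die on 5nm: ~{navi31_est:.0f} mm^2 vs {navi21_area_mm2} mm^2")
```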

Regardless of the quality of this particular leak, I think it's pretty much a foregone conclusion that AMD (and Nvidia, and Intel) are going to be moving to MCM GPUs on the high-end within the next few years.
 
I guess I'm more referring to supplying to meet demand. They are obviously fabbing and selling the product, but if they could increase their output by 300%, they'd still fly off the shelves.

So, to me, it makes no sense to create a product that uses two dies when they are already having tons of trouble maintaining supply of the product that only uses one.
Yeah, screw looking to the future until you have your current supply constraints figured out.

Glad you are not in charge and are rather just a poster on a forum.
 
So, to me, it makes no sense to create a product that uses two dies when they are already having tons of trouble maintaining supply of the product that only uses one.
If this were true, then AMD wouldn't be breaking revenue records. And it's not just consoles. Like Nvidia, everything they make is sold. You think they are going to supply Joe Schmo over Origin, Dell, and other major PC-builder companies? Besides, 6800s are available at Microcenter daily now near me. Problem is, they are $350-400 over MSRP, but some knuckleheads keep buying them.
 
because a single one of these chips is nearly impossible for AMD to supply, so why put the dollars into a project (including all the way up to production of engineering samples) that requires TWO of the chips? Even if AMD had the idea in a state of planning and simulation, the 7nm demand and supply issues would have been an obvious indicator that such a product would never be feasible, and it wouldn't have moved to the real-world testing phase.
As more production moves to 5nm (Apple, etc.), this frees up 7nm capacity towards the end of the year/2022 for such efforts.
 
because a single one of these chips is nearly impossible for AMD to supply, so why put the dollars into a project (including all the way up to production of engineering samples) that requires TWO of the chips? Even if AMD had the idea in a state of planning and simulation, the 7nm demand and supply issues would have been an obvious indicator that such a product would never be feasible, and it wouldn't have moved to the real-world testing phase.
Higher margins? If you can sell one 6800 at a 40% profit margin, but a dual 6800 using fewer overall resources would theoretically sell for twice as much at a 50% margin, and there's enough of a market for it, why wouldn't you research that? Look at the 5800X situation for a similar scenario. It's basically a higher-end chip cut in half, but the higher-end chips typically have a greater profit margin than lower-end chips, so there was concern that a lower-margin 5800X would eat up chiplets destined for higher-margin CPUs. You'd want to max out sales of the 5950X, or whatever your highest-margin chip is, first; the 5800X-type single-chiplet model then gives you a lower-end version that you can easily produce with the same resources once you have satisfied demand at the high end. But you don't want 5800X sales cannibalizing your limited supply of chiplets that are also used in higher-margin parts, so you raise the 5800X price accordingly.
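
To put toy numbers on that margin argument (the prices and margins below are invented purely for illustration):

```python
# Hypothetical figures purely to illustrate the per-die margin argument.
single_price, single_margin = 650.0, 0.40   # one-die card
dual_price,   dual_margin   = 1300.0, 0.50  # two-die card at twice the price

profit_single_per_die = single_price * single_margin
profit_dual_per_die = dual_price * dual_margin / 2  # two dies share the profit

print(f"Profit per die, single-die card: ${profit_single_per_die:.0f}")
print(f"Profit per die, dual-die card:   ${profit_dual_per_die:.0f}")
# The same two dies earn more packaged together than sold separately,
# which is why the R&D could pay off even with constrained supply.
```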

AMD has struggled in the past to stay competitive vs. Nvidia and Intel, so they need every edge they can get, and the chiplet design seems like a good way to address that. It gives them the flexibility of multiple levels of hardware with a lot less design work. Honestly, with GPU tasks being highly parallel, I think it makes a lot of sense, although I'm wary because of the issues both sides have had in the past with multi-GPU setups and drivers.
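
As a toy illustration of that parallelism point, here is a pretend per-pixel workload split across two worker processes standing in for two dies; nothing AMD-specific is assumed:

```python
from concurrent.futures import ProcessPoolExecutor

# Toy illustration of why pixel work splits cleanly across two chiplets:
# each half of the frame can be shaded independently, with no
# communication needed until the halves are stitched back together.
WIDTH, HEIGHT = 640, 480

def shade_rows(rows: range) -> list[list[int]]:
    """Pretend per-pixel shading: brightness derived from position only."""
    return [[(x ^ y) & 0xFF for x in range(WIDTH)] for y in rows]

if __name__ == "__main__":
    halves = [range(0, HEIGHT // 2), range(HEIGHT // 2, HEIGHT)]
    with ProcessPoolExecutor(max_workers=2) as pool:  # two "dies"
        top, bottom = pool.map(shade_rows, halves)
    frame = top + bottom  # the stitch step is where the inter-die link matters
    print(len(frame), "rows shaded across 2 workers")
```

The stitch at the end is the part an on-package link has to make invisible, which is exactly where past multi-GPU setups fell down.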
 
Higher margins? If you can sell one 6800 at a 40% profit margin, but a dual 6800 using fewer overall resources would theoretically sell for twice as much at a 50% margin, and there's enough of a market for it, why wouldn't you research that? Look at the 5800X situation for a similar scenario. It's basically a higher-end chip cut in half, but the higher-end chips typically have a greater profit margin than lower-end chips, so there was concern that a lower-margin 5800X would eat up chiplets destined for higher-margin CPUs. You'd want to max out sales of the 5950X, or whatever your highest-margin chip is, first; the 5800X-type single-chiplet model then gives you a lower-end version that you can easily produce with the same resources once you have satisfied demand at the high end. But you don't want 5800X sales cannibalizing your limited supply of chiplets that are also used in higher-margin parts, so you raise the 5800X price accordingly.

AMD has struggled in the past to stay competitive vs. Nvidia and Intel, so they need every edge they can get, and the chiplet design seems like a good way to address that. It gives them the flexibility of multiple levels of hardware with a lot less design work. Honestly, with GPU tasks being highly parallel, I think it makes a lot of sense, although I'm wary because of the issues both sides have had in the past with multi-GPU setups and drivers.

I completely agree that chiplet/MCM is the future for GPUs; I just don't think this 'leak' is legit.
 