VideoCardz: AMD to introduce Zen3 on October 8, Radeon RX 6000 series on October 28

And this is based on what? More YT rumors?

Well, you win this one, because there's no way you'll ever find out from Nvidia lol, and nobody is willing to go on record without staying anonymous.

Mooreslawisdead posted an article and you can check it out. You don't have to believe it, but you also won't ever find out directly from Nvidia. Given how much Nvidia loves their margins, it makes sense to push AIB cards more and take their margins that way.
 
I do think there was a lot of over-reaction on cooler costs. NVidia FE coolers likely do cost a bit more than AIB coolers of similar size. Remember the claims that card prices were going to be high because the coolers were so expensive? :rolleyes:

But I don't think the 3080 FE cooler is any more expensive than the 2080 FE cooler. Similar materials usage, looks like similar design complexity. 3090 FE cooler is probably a bit more because it's significantly bigger.

It's a non-factor in the overall scheme of things. NVidia obviously has an advantage over AIBs because they effectively get chips at cost, since they manufacture them, and they will want to sell FE cards because they capture more revenue on their own cards than by just selling the chips. Not only that, but it lets them have a good showing if AIBs drop the ball on cooling.


Your argument about Nvidia getting chips at cost really isn't a good one. They aren't paying extra to sell chips to AIBs; they still make them at cost. They make their margins that way, since AIBs likely sell more cards and Nvidia has to do zero work: just sell the chip, make the margin, and not worry about manufacturing anything else at all. That is the best part about it. AIBs pay the bill of materials to Nvidia. That is the easiest sale they are going to make.

When was the last time an AIB for Nvidia dropped the ball on cooling? Cooling has generally been better on aftermarket cards.
 

AIBs obviously don't pay cost though, so they are at a disadvantage competing against NVidia for card sales.

When NVidia sells their own card, they make revenue and profit for the Chip and the card. When they sell the chip to an AIB they only get chip revenue and profit.

Revenue and profit from selling both the card and chip will obviously be higher than selling the chip alone.

NVidia should be very happy to sell all the FE cards they can, and increase revenues and profits beyond what selling chips alone would bring.
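The margin math behind that claim can be sketched with a toy example (every dollar figure below is invented purely for illustration, not an actual Nvidia cost or price):

```python
# Toy comparison of Nvidia's take per GPU sold, with made-up numbers.
chip_cost = 100    # assumed cost to manufacture the GPU die
chip_price = 250   # assumed price charged to an AIB for that die
card_bom = 150     # assumed cost of the rest of the card (PCB, VRAM, cooler)
card_price = 700   # assumed retail price of a complete FE card

profit_chip_only = chip_price - chip_cost             # selling just the die to an AIB
profit_fe_card = card_price - (chip_cost + card_bom)  # selling a complete FE card

print(profit_chip_only)  # 150
print(profit_fe_card)    # 450
```

Under any plausible numbers the complete card captures more absolute profit per unit, which is the point being made; the AIB sale is still attractive because it carries no manufacturing or support burden beyond the die itself.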
 

Who cares, since this has ZERO to do with the AMD RDNA2 release. :) Considering that AMD has a lock with TSMC, at least from what I can see, they should have no issue producing enough, unless Zen 3 takes off even more than expected. I am pleased to see that competition is most definitely happening and Raja is no longer there to screw it up. :D
 
AMD’s Next-Gen Radeon RX 6000 Series ‘RDNA 2’ Graphics Cards Alleged Benchmarks Leak Out
As interesting and potentially upsetting as the flagship cards are shaping up to be, this quote is what I'm hoping leads to big, well, specifically little things:

"RDNA 2 architecture goes through the entire stack. It will go from mainstream GPUs all the way up to the enthusiasts and then the architecture also goes into the game console products ... as well as our integrated APU products."

There's nothing stopping AMD from producing an APU with PS5 or Xbox X/S graphics outside of packaging and thermals. If AMD made a single socket solution that could offer the same graphics power as even a cut-down console APU it would be the de-facto go-to choice for just about everyone. Not because it's exciting, but because it's safe.

Now picture that as a mobile part. Safe, based on what these parts appear to offer, could still mean 60 fps 1080p gaming on fresh titles. Maybe a bit memory-bandwidth limited, middle-of-the-road settings, but achievable.

Ryzen mobile APUs punch well above their weight class. I wouldn't mind seeing a "Big Zen" APU down the line.
 
Hmmm, I wonder which one this is? And no, before an endless debate happens, I doubt this is the top card they are releasing. Now, the real question is which ones they are going to release for purchase on the 28th of October.

One of the mid-range SKUs as a sample. "Not an apple fan" doesn't really mention leaks, but from what he has heard, he was losing his shit over these confusing leaks lol. He said he will only say that none of the AIBs have high-end test SKUs, zero. AMD is keeping those in house. It makes full sense if we are not getting any AIB cards until late December or January and AMD is supplying them for the first few months.
 
but because it's safe.
I went laptop shopping over the holiday weekend. I looked at Dell's XPS 15, and Razer's equivalents, but wound up with a refurb Latitude as a holdover*.

While that's got very little to do with this thread, the thing that does is that I saw AMD APUs being cited by reviewers as being problematic for video editing.

The Latitude I wound up with is a 5410, and has an i7-10610U, which is another Skylake refresh in four core eight thread configuration. What I'd really like instead is something with a similar configuration but with a 4800U to chew through workloads. Sadly that doesn't exist, but even if it did, I'd still be concerned about application compatibility.


*I'll probably get something when the next round of CPUs and GPUs hit mobile; two things that I find essential, that I couldn't get in any 'normal' Dell, would be 120Hz or faster as well as VRR of some sort. I'll say that I was pretty disappointed that even many AMD APU sporting laptops also didn't have VRR listed for whatever reason; of course, neither did the Dells, and that should really be one of the highlights!
 

As per this leaked roadmap, a Zen 3 + RDNA2 APU (Rembrandt) is expected by the end of next year.

And in 2022, even desktop CPUs (Raphael) are expected to come with an iGPU, like all Intel CPUs today.

If I've got this right:

Desktop Roadmap:
  • Matisse — (Zen 2, no iGPU, AM4 socket) current version
  • Vermeer — (Zen 3, no iGPU, AM4 socket) expected by the end of this year
  • Warhol — (Zen 3+, no iGPU, AM5 socket?) expected next year
  • Raphael — (Zen 4 + Navi RDNA(2?), AM5 socket?) expected in 2022
Mobile Roadmap:
  • Renoir — (Zen 2 + optimised Vega) released at the beginning of this year
  • Cezanne — (Zen 3 + further optimised Vega) expected by the end of this year!?
  • Rembrandt — (Zen 3 + Navi RDNA2) expected by the end of next year
Ultra-Portable Roadmap:
  • None so far
  • Van Gogh — (Zen 2 + Navi RDNA2) expected by the end of this year
  • Dragon Crest? — (Zen(4?) + Navi RDNA(3?)) expected by the end of next year

EDIT:
(updating with this information below)

CoffeeFox (@CoffeeFox3) Tweeted:
https://twitter.com/CoffeeFox3/status/1297225481331175424?s=20

https://hardforum.com/threads/amd-ryzen-in-2021-warhol-van-gogh-and-cezanne.2000402/post-1044696431
 
Rumored for quite a while now: the Zen 3 CCX is a unified 8 cores. This should really help kill the latency advantage Intel has in gaming. Hope this one is true.


I hope they have an easy toggle for the 16-core model to swap to 8 cores for games that scale to 8 cores or fewer (which is most). However, there is one game out there that eats cores that just came out; I can't remember what that is. Man, if the Zen 3 CCX is indeed a unified 8 cores, this will be a versatile CPU alright. I bet they are going to get creative with the CCXs this time, though: a vanilla 4700 as 4+4 harvested from bad 4950X dies, while the 4700X gets one 8-core die, so people can't get the same performance for less through overclocking. Just like the 4-core Zen 2 CPUs.
 
Well, if you go by Xbox Series X numbers and just take those clocks, here is what it looks like after calculating TFLOPS, since RDNA and Turing are really fairly close in TFLOPS-to-gaming-performance.

52 CU = 3060 competitor. On par with or a little faster than a 2080 Super.

60 CU = 2080 Ti competitor and 3070 competitor.

This is based off Xbox game clocks for both. If AMD is indeed pushing 2 GHz+ on the final cards, then you are looking at even faster for those two SKUs.
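A back-of-the-envelope version of that TFLOPS math, using the standard RDNA figure of 64 stream processors per CU; the 1.825 GHz input is the Xbox Series X game clock, while the 2.0 GHz figures are only the rumored desktop clocks, so treat those outputs as rough estimates:

```python
# RDNA2 FP32 throughput: 64 stream processors per CU, 2 FLOPs per clock (FMA).
def rdna2_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

# Xbox Series X game clock as the baseline:
print(rdna2_tflops(52, 1.825))  # ~12.1 TFLOPS
# The same CU counts at a rumored 2.0 GHz desktop clock:
print(rdna2_tflops(52, 2.0))    # ~13.3 TFLOPS
print(rdna2_tflops(60, 2.0))    # ~15.4 TFLOPS
```

TFLOPS only track raster performance within the same architecture family, which is why the comparison above leans on RDNA and Turing having similar TFLOPS-per-frame behavior.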

Maybe higher clocks will be reserved for the 6800 and 6900 series. But you are looking at a very competitive scenario when it comes to raster performance. The only way AMD doesn't compete at the 3080 level is if they just stop at 60 CUs and don't make anything bigger, which I don't believe will be the case given that Lisa Su has talked about leadership performance across the full stack, top to bottom, and has also called Big Navi a top-end halo product.

Now, the average performance of the 3080 remains to be seen, but it's looking like 25-30% above the 2080 Ti on average from leaked benches. I think you are looking at a 6800 that sits between the 3070 and 3080, and a 6900 that actually sits 10% or so above the 3080. Hence you will likely see an Nvidia refresh with a 3070 Ti and 3080 Ti.
 
There's nothing stopping AMD from producing an APU with PS5 or Xbox X/S graphics outside of packaging and thermals. If AMD made a single socket solution that could offer the same graphics power as even a cut-down console APU it would be the de-facto go-to choice for just about everyone. Not because it's exciting, but because it's safe.

There's also market segmentation, power supply and memory bandwidth.
 
if they have a card faster than 3080 and with 16GB of RAM vs 10GB, they shouldn't need to sell it for less.
Why not? That's a good way to make sales and gain market share. Oh yeah, and take us back to the days when x70 cards were $300 and x80 cards were $500. Let's not forget the way our economy is going. If AMD pulled something like that, they would probably gain even more fans than they already have, while also being regarded as saviors of PC gaming. You do know that once they work out the architecture, the cards don't cost them any more to make now than they did before. Robots do most of the work now anyway.
 

Because GPUs cost more to make these days, and it now costs over $100 million in upfront costs to start manufacturing a new series, so they need decent margins to recoup those costs as quickly as possible.

Recoup costs too slowly and you are looking at a new GPU series before the old one is even profitable.
 
Nvidia doesnt need to disrupt a market they pretty much own.

Own? I wouldn't go THAT far... don't forget every console gamer in the world is running Radeon, and probably 25%(??) of PC gamers. So actually, more people game on Radeon than Nvidia when it comes down to it.

I know I will be waiting at least till Oct 8 to make my decision. AMD may just have an ace up their sleeve. Nvidia may know this and be trying to get to market first to make up for the sales they MAY lose to AMD when performance numbers, especially performance per $, hit the fan.

Interesting times we are living in.
 
Every console gamer in the world? I highly doubt that statistic, seeing as NVIDIA makes the GPU for the Switch, which has over 25% of the console market. So if we go by current-gen consoles (I assume you weren't talking about older generations, or the numbers would be even further off), it's more like 75% of all console gamers running Radeon... which doesn't sound nearly as good as your sentence, but I prefer accuracy over making things sound good ;). I understand the point, but still, not many PS4 owners are looking at upgrading the GPU in their console to a 30x0 or "Big Navi", so it's not really a metric that matters for PC GPUs.
 
With two out of three of those artificial.
Maybe I'm missing it, but what two are artificial?
Either you're trying to say no, everyone is willing to pay any price (market segmentation)
Or you're trying to say that power is no issue and you can build a GPU that uses 0 watts and performs better than the 3090?
Or you're saying memory bandwidth isn't a limitation and there is no reason to ever use more than 1-lane wide memory structures?

I'm really not sure what you were intending to say. I'm probably just taking something too literally.
 
Sorry.

Market segmentation and power usage are artificial, within reason, to a point. Once costs are covered, segmentation is up to marketing and competition. Same with power use; you can, again, within reason, to a point, bump up power usage to make a part run how you want it to run.

I think the best current example of this is Nvidia's mobile 1000- and 2000-series parts. There's a wide range of market segmentation in order to market a spread of performance, while the actual performance isn't set by the chips themselves, but the power and cooling available to the parts.

There are upper cooling and performance limits that limit the amount of segmentation and power use options, but within those boundaries, those points are set by deliberate choice. Those points are artificial.
 

This is about APUs though. Ultimately these are lowest common denominator parts. They need small dies to be cheap enough for high volume.

Of course you can make a specialized high end APU with a big GPU, those are what Microsoft and Sony get for their consoles.

But now you have a much more expensive niche part, that won't work in normal motherboards.

AMD could have built something like this for ages, but they didn't, which implies they don't see enough volume to justify it.

Too many people assume what they would buy is what everyone would buy, so no the Expensive Niche big GPU part is not a universal desire. Most people just want a cheap laptop.
 
This is about APUs though. Ultimately these are lowest common denominator parts. They need small dies to be cheap enough for high volume.

I'm not so sure about that anymore. There are builders out there already making entry-level gaming devices with AMD APUs.

I think an APU that could deliver 1080p gaming with modern titles at meh or alright detail at a reasonable FPS wouldn't be "entry level," it would be "de-facto," making up for volume.

This isn't impossible for AMD. They have mobile parts with 8 cores, 16 threads, and 8 graphics cores at 45 watts (15 W ultra-portable!). They should be able to do 8/16+20 like the Xbox Series S and put it in mainstream packaging. Even 8/16+16 or 6/12+20 or ... there's no way that doesn't dominate the entry level for gaming on desktop or laptop.
 
That's just the world, my friend. These customers have friends they play online with. They have YouTubers they watch, streamers they look up to, and IT departments that advise them.

If you're a kid who gets a new PC as a birthday present, and all your friends run Nvidia cards while YOU run an AMD card, and when joining your friends online in a game YOU crash while nobody else has difficulty: you're going to get shit on by your friends for being poor and getting the poor man's computer with tons of issues. If your friends have issues and you don't: you're just lucky.

If you're an adult who has maybe two hours a WEEK to game and you just want something that is ready to go, and you see people on YouTube talking about 5700 XT black-screen issues: what are you going to choose? Yeah, these issues are almost all resolved
...
...
...
but Nvidia cards never had those issues. You don't want to spend those precious few hours troubleshooting. So when you buy a new PC (something you may do only every 5-6 years), you go with the thing you feel will give you the LEAST amount of grief for the half-decade you're going to be stuck with it.


Face it. The only way AMD can win in this environment is by being perfect. Anything less than 100% perfection will be met with distrust.

I call BS on NV supremacy and AMD inferiority, so why is your rig AMD CPU and GPU then?

Main Rig: "Threasurrection" ■ WC Ryzen TR 2950X ■ Vega56 (sold the 2080Ti)
 
It reminds me a bit of my reference 6950, but it should be much quieter with the three fans and internal venting versus the blower fan on the 6950.

I like the way it looks and it looks like there is an actually good heatsink on the card, this time. (That said, I have no issues with the Reference Vega 56 and RX5700 that I own.)
 
I still would have preferred if they had run those fins lengthwise down the card to vent out the back, instead of sideways back into the case like that.
 

Actually Hardcore Overclocking did a video on the cooler. Sounded like Buildzoid lol; I only know him by voice, so I wasn't sure if he runs that channel. He looked the card over from the Fortnite angles. Might be nothing, but he was suggesting this could mean HBM on the top-end cards, given the screw holes on the side, because GDDR cards don't have those and all AMD HBM cards do.

Lol. So let’s see.
 
https://videocardz.com/newz/amd-showcases-radeon-rx-6000-graphics-card-design


Well, that looks interesting at least. Looks like higher-end Navi with two 8-pin connectors. I wouldn't mind that card in my rig.
How do you plug that into your system? There are no bracket or video ports... it looks like something someone made up and put online. Why would they block the front and back from ventilation? That heatsink doesn't look like it leaves much room for a PCB, although from this angle it's hard to tell. This does not look like a great render. This is something I would make in Blender while trying to learn the shortcuts, not something a large company designed and intends to build. It honestly looks better/more realistic in Fortnite, lol.
 

Well, they are just showing the front side with the bracket so it looks straight, I guess. Just a render of what it would look like, maybe? Who knows, but it looks promising.
 

It's a render of the cooler only.
 
So, a bit unrelated but still relevant to Big Navi: the PS5 SoC is reportedly having major yield issues, with only 50% of dies viable, so Sony had to cut orders from 15M units to 11M. That's a pretty substantial cut, and considering the SoC is Zen 2 + Navi 2x, could this spell trouble for Big Navi, which should have a much bigger die than the PS5 APU? Relevant article: https://www.bloomberg.co.jp/news/articles/2020-09-15/QGFJPPDWLU6M01



If this happened to Sony, then MS, with their bigger Series X chip, might also be suffering from the same yield issues, which would mean they will push the Series S really hard this Christmas. So what does all this mean for the RX 6000 series? Since all of these guys are using the same TSMC process, AMD won't be immune to the yield problems either, and they might not have enough Big Navi available, so we could end up with a paper launch. If that happens, it will be incredibly ironic if NVIDIA's decision to go with Samsung results in better yields and more products on shelves, considering so many so-called insiders speculated that NVIDIA would have shortages (first it was Samsung yields and now it's coolers; they can't make up their minds).

There was a recent rumor that Sony increased the CU count at the last minute before manufacture which would account for decreased yields.
 
I would highly doubt that.

An old report from earlier this year stated that the GPU has 40 CUs, 4 of which were later disabled. I don't see it as beyond possibility that Sony decided to enable the 4 CUs they had originally disabled for yield reasons, considering they are faced with a much more powerful Xbox.
 

Consoles are custom, so maybe there will be yield issues earlier. It might be particular to Sony, because we haven't heard anything about Xbox.

But when it comes to a straight GPU, yields are likely going to be way higher, because you are not building an APU, which I am sure is more sophisticated and probably reduces yields further.
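To see why die size dominates this discussion, here's a sketch using the classic Poisson die-yield model. The defect density below is an invented number chosen only so that a console-sized die lands near the rumored 50% figure, not a real TSMC statistic, and the die areas are round illustrative guesses:

```python
import math

# Poisson yield model: fraction of defect-free dies = exp(-area * defect_density).
def die_yield(area_mm2: float, defects_per_mm2: float) -> float:
    return math.exp(-area_mm2 * defects_per_mm2)

d0 = 0.0023  # assumed defects per mm^2 (illustrative only)

print(round(die_yield(300, d0), 2))  # ~0.50 for a console-sized SoC
print(round(die_yield(500, d0), 2))  # ~0.32 for a bigger "Big Navi"-sized die
```

The caveat is that a standalone GPU can harvest partially defective dies into lower SKUs by disabling CUs, which a fixed-spec console part can't do, so the effective yield for a GPU product line is better than the raw defect-free fraction suggests.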
 