AMD’s upcoming flagship GPUs should be 3x faster than RX 6900XT

Should be, but it won't be. Well, they'll pull some BS marketing metric that no one cares about.
 
*****Pulls up chair, grabs bag of popcorn...grabs beer....turns on pr0n on tv...and waits.....
 
If this is when AMD moves to chiplets in GPUs, I could actually see a larger-than-normal performance ceiling rise at the high end, but I wouldn't believe for a moment that performance in the midrange or perf/$ in general will go up by anything remotely close to 3x. If the top-end RX 7000 or whatever is somehow 2.5-3x as fast as the 6900 XT by virtue of chiplet scaling on top of IPC & clock gainz, it'll also be an even higher performance tier (7990?) and cost more accordingly. Feel free to quote the entirety of this post next year to see how I did lol.

Going by current MSRP

6900 XT > $999

7990 XT > $2499
 
Going by current MSRP

6900 XT > $999

7990 XT > $2499
I'm thinking along those lines too

7800 XT = 80 CU (single chiplet), ~20% faster than 6900 XT in raster, ~50% faster in RT, ~$799
7900 XT = 120 CU (dual chiplet), ~70% faster than 6900 XT in raster, ~150% faster in RT, ~$1199-$1499
7990 XT = 160 CU (dual chiplet), ~100% faster than 6900 XT in raster, ~200% faster in RT, ~$1999-$2499

disclaimer: all performance numbers are pulled out my arse, please no one post these on WCCF claiming a confirmed performance leak thanx
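
For what it's worth, here's a quick back-of-the-napkin Python sketch of how perf/$ would shake out if those made-up numbers held (midpoints taken for the price ranges; everything here is hypothetical):

```python
# Back-of-the-napkin raster perf/$ vs the 6900 XT, using the guessed numbers above.
# Purely illustrative; none of these figures are real.
cards = {
    "6900 XT": {"raster": 1.0, "price": 999},
    "7800 XT": {"raster": 1.2, "price": 799},
    "7900 XT": {"raster": 1.7, "price": 1349},   # midpoint of $1199-$1499
    "7990 XT": {"raster": 2.0, "price": 2249},   # midpoint of $1999-$2499
}

baseline = cards["6900 XT"]["raster"] / cards["6900 XT"]["price"]
for name, c in cards.items():
    ratio = (c["raster"] / c["price"]) / baseline
    print(f"{name}: {ratio:.2f}x the raster perf/$ of the 6900 XT")
```

Which, if those guesses held, would back the earlier point: perf/$ improves in the midrange but barely moves (or even regresses) at the top of the stack.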
 
Eh, they have a different solution going. Basically the system only sees one GPU, which is the I/O die. The I/O die tells the chiplets what to do, takes the info they produce, and presents it to the system. Essentially, it's just one big GPU even if it's hosted on multiple physical dies.

I don't know if they work like one big die or if they split up roles, like SFR back in the day. But however it works, it's a black box as far as the system is concerned.
The last paper I read from Nvidia, from about two years ago, was about how to lower the latency between the I/O die and the chiplets to improve frame-time issues. I have to assume they all had similar issues and have been working on it all this time. Three years seems like a reasonable timeframe to tackle this issue.
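
Purely as a toy illustration of the quoted "system only sees one GPU" idea (not real driver code; every name here is made up):

```python
# Toy sketch: the host only ever talks to the I/O die, which scatters work
# across compute chiplets and gathers one combined result.
class Chiplet:
    def execute(self, chunk):
        return [x * 2 for x in chunk]   # stand-in for "render this slice of work"

class IODie:
    """The only 'GPU' the host ever sees; the chiplets behind it are invisible."""
    def __init__(self, num_chiplets=2):
        self.chiplets = [Chiplet() for _ in range(num_chiplets)]

    def submit(self, workload):
        n = len(self.chiplets)
        chunks = [workload[i::n] for i in range(n)]            # scatter
        partials = [c.execute(ch) for c, ch in zip(self.chiplets, chunks)]
        return [item for part in partials for item in part]    # gather

gpu = IODie(num_chiplets=2)          # host enumerates a single device
print(gpu.submit(list(range(8))))    # work is silently spread across both chiplets
```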
 
The last paper I read from Nvidia, from about two years ago, was about how to lower the latency between the I/O die and the chiplets to improve frame-time issues.

Makes it sound like AFR is back on the table...
 
Makes it sound like AFR is back on the table...
geez I hope not... NV and AMD spent the entire 2010s insisting that "~this time~ AFR frame pacing, scaling, and compatibility is fixed!" and it never really was. I'm hoping for more like the arrangement on Ryzen 3000/5000 where the IO die holds the last-level cache and scheduler and treats the compute dies as a single pool of threads.
 
I'm hoping for more like the arrangement on Ryzen 3000/5000 where the IO die holds the last-level cache and scheduler and treats the compute dies as a single pool of threads.

So far that's the only thing I've heard about RDNA chiplets. It'd be kinda funny if Nvidia and AMD come to completely different solutions using the same chiplet concept.
 
And you can't buy one


This x100. As more and more folks link their rig(s) to places like NiceHash, it will only get worse. We are at least 5 years away from some of these newly announced fabs being online and pumping out wafers.

Cryptos are not going away this time, but instead are about to become the basis of the entire world's financial system, and then some. The same way postcards, record stores, DVD rental stores, Kodak, et al. went the way of the dodo bird, so very, very soon will paper money, banks, Brinks armored cars... and credit cards go away. Bretton Woods "2"/Agenda 2020 is about to turn it all on a dime, all part of the Great Reset. The most disruptive tech ever to be revealed is coming along with it, all on the blockchain and all tied to a UN-controlled, Chinese-style social credit score on steroids (check the ID2020 website/technology started by Bill Gates/WEF/DARPA, yes that DARPA.../BIS/ECB/IBM), rolled out during the lockdowns so that the expected violent public protests can't be held, for "safety" reasons of course... "It's just two weeks to flatten the curve."

Soon, your online behavior will openly count towards whether you'll be allowed to buy, or even enter a lottery for, that RTX 4060 or RX 7900 XT card. No, I suspect many will stay on whatever card they have now, as a laundry list of new reasons most have not thought of yet will make it much harder to obtain new tech like a video card, no matter the MSRP. Do your own research, folks, and act accordingly.
 
A well-reputed leaker on AMD's chips, @Greymon55, has shared his speculated specifications for the RDNA 3 dies.

Greymon55 (@greymon55) Tweeted:
Some information and a few guesses:
31=
~60WGP
~16GB 256bit GDDR6
~256M 3D IFC
~2.5GHz

32=
~40WGP
~12GB 192bit GDDR6
~192M 3D IFC
~2.6~2.8GHz

33=
~16WGP
~8GB 128bit GDDR6
~64M IFC
~2.8~3.0GHz
https://twitter.com/greymon55/status/1471693761579130888?s=20

Via HardwareTimes https://www.hardwaretimes.com/amd-r...-7800-xt-and-rx-7700-xt-specifications-rumor/
 
Yeah but the new flagships "leaked" from both AMD and NVidia show them to be MCM designs with almost 2x the silicon by surface area and almost 2x the power draw over the existing lineup.
 
Yeah but the new flagships "leaked" from both AMD and NVidia show them to be MCM designs with almost 2x the silicon by surface area and almost 2x the power draw over the existing lineup.

Only AMD is going with chiplets in the next round. Nvidia's will be monolithic. Power envelope either way is expected to be huge, though.
 
Only AMD is going with chiplets in the next round. Nvidia's will be monolithic. Power envelope either way is expected to be huge, though.
A little disappointed that Nvidia is staying monolithic, but yeah, both AMD and Nvidia are supposed to have 600 W+ flagships. That is gonna get spicy.
 
Everything I've heard, seen, or read says RTX4000 is gonna be a gjjjeeeeyyiiiinooormous die, but that this will be the last monolithic flagship from Nvidia.

And that AMD has two flagship RX7000 designs in the works, one's insane, and the other one's insaner.

The rumors point to AMD probably running with the slightly more sane version, but it depends on how much power Nvidia thinks they can get away with using. They want to take the crown for sure this time and with the chiplet advantage this might be their best chance.

I've been paying more attention to upcoming product rumors lately since I've pretty much ruled out putting together a complete new rig this generation.
 
I'm thinking along those lines too

7800 XT = 80 CU (single chiplet), ~20% faster than 6900 XT in raster, ~50% faster in RT, ~$799
7900 XT = 120 CU (dual chiplet), ~70% faster than 6900 XT in raster, ~150% faster in RT, ~$1199-$1499
7990 XT = 160 CU (dual chiplet), ~100% faster than 6900 XT in raster, ~200% faster in RT, ~$1999-$2499

disclaimer: all performance numbers are pulled out my arse, please no one post these on WCCF claiming a confirmed performance leak thanx
So they'll still be one generation behind NVIDIA with ray tracing performance.
 
So they'll still be one generation behind NVIDIA with ray tracing performance.

I expect they'll brute-force their way to decent ray tracing with the 7000 series and add accelerators in the 8000 series. By that time, Nvidia will also be on chiplets, at which point I kind of expect both companies to handle ray tracing with dedicated parts. Not dual GPUs, but rather discrete chips for different functions, all under the same heat spreader.

That's just a guess, but it lines up with all the rumors and patent filings.
 
...for the handful of games that actually take advantage of it 3 years after launch...
If Raytracing wasn't here to stay, AMD wouldn't be spending money trying to improve theirs.

And it's the intergenerational improvement that's commendable, not where they stand in relation to Nvidia. Raytracing isn't zero sum, everyone benefits when there's competition to do it better and faster. RT has always been the graphical holy grail.

Raja Koduri of Intel was recently spotted watching raytracing videos on YouTube, and then commenting "whoa, that's so cool, how does it do that?" So Intel is probably thinking about RT now as well.
 
Intel supports ray tracing on their new GPUs but I doubt the performance is as good as Nvidia (maybe better than AMD, but that's not saying much).
 
If Raytracing wasn't here to stay, AMD wouldn't be spending money trying to improve theirs.

And it's the intergenerational improvement that's commendable, not where they stand in relation to Nvidia. Raytracing isn't zero sum, everyone benefits when there's competition to do it better and faster. RT has always been the graphical holy grail.

Raja Koduri of Intel was recently spotted watching raytracing videos on YouTube, and then commenting "whoa, that's so cool, how does it do that?" So Intel is probably thinking about RT now as well.
New development middleware is leaning heavily into ray tracing as a means of decreasing development time and complexity. So much of game "optimization" (I hate that term) is tweaking texture maps, shadows, and assets to maximize visual impact while minimizing the performance hit for using them. The new tools instead take raw high-definition assets with their movie-quality textures, use software and AI to apply the necessary optimizations for the desired platform, and then lean on the RTX libraries for shadows, reflections, and other visual goodies. Yeah, it requires RTX and all the ray tracing goodies, but it replaces thousands of hours of manual labor with a button push and gives very consistent results. Using RTX saves developers very real money on a project while delivering better visuals, which is really what PC gaming is all about.
 
A little disappointed that Nvidia is staying monolithic, but yeah, both AMD and Nvidia are supposed to have 600 W+ flagships. That is gonna get spicy.

Yeah. How long before external flagship GPUs (with their own PSU) become standard?
 
Yeah. How long before external flagship GPUs (with their own PSU) become standard?
I don't know, but at 600 W, if I could put it under my desk opposite my tower it would be a great replacement for the Honeywell ceramic heater I use in there to keep my toes warm.
 
New development middleware is leaning heavily into ray tracing as a means of decreasing development time and complexity. So much of game "optimization" (I hate that term) is tweaking texture maps, shadows, and assets to maximize visual impact while minimizing the performance hit for using them. The new tools instead take raw high-definition assets with their movie-quality textures, use software and AI to apply the necessary optimizations for the desired platform, and then lean on the RTX libraries for shadows, reflections, and other visual goodies. Yeah, it requires RTX and all the ray tracing goodies, but it replaces thousands of hours of manual labor with a button push and gives very consistent results. Using RTX saves developers very real money on a project while delivering better visuals, which is really what PC gaming is all about.
Yeah, I remember that being specifically mentioned in the Metro Exodus Enhanced Edition deep-dive vid; there was an especially poignant part where a dev demonstrated the rasterized lighting workflow and spent a significant amount of time tweaking light and shadow maps, adding invisible light sources for fill lighting, tweaking it more, iterating. Then for the RTX workflow on the same scene it was just, "I tell the game engine that light bulbs are light bulbs and it does the rest."
 
Might as well be 1000x faster; I'm keeping my 2070 at this point. Screw both AMD and Nvidia for treating their customers the way they have for the past year!

Supply issues aren't either company's fault. Same with prices. The insane markups have all come from retailers and AIBs. AMD and Nvidia can control pricing on cards sold directly by them (which is why Nvidia FE cards sell at MSRP whenever they appear), but they have no control over what AIBs charge, nor what retailers charge for AIB cards. Blame COVID complications combined with what would have been an inevitable supply issue due to massively increased demand for parts from every industry.
 
I can't get excited for GPUs anymore; it's ruined for me. Hope one day they'll sort it out and we'll actually be able to buy at (reasonable) MSRP.
Yep. Considering you can only reliably sort-of, kind-of get a 2060-level-performance card... anything newer/better, I just don't care about. They're insanely difficult and too expensive to acquire.
 
3090s are in stock here in New Zealand for US$3,000.

So glad I didn't sell the 2080 Ti. What in the actual fuck, though.

And I am the sort of person who would love to have the problem of cooling a 600 W card. Next gen had better have the goods and competitive pricing.
 
Maybe the power draw will be enough to put a dent in mining profitability.
 
Maybe the power draw will be enough to put a dent in mining profitability.
You would hope, but if they use 2x the juice while being 3x faster, then technically the electricity per unit of work goes down and the profit goes up.
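
Quick back-of-the-napkin arithmetic on that (numbers entirely made up, just to show the ratio):

```python
# Hypothetical figures only: 3x the hashrate for 2x the power means the
# electricity cost per unit of work goes *down*, not up.
kwh_price = 0.12                               # $/kWh, arbitrary
old_hashrate, old_watts = 60, 300              # made-up current-gen card
new_hashrate, new_watts = 3 * 60, 2 * 300      # 3x faster, 2x the power

def cost_per_mh_day(watts, hashrate):
    # daily electricity cost divided by hashrate
    return (watts / 1000) * 24 * kwh_price / hashrate

print(f"current gen: ${cost_per_mh_day(old_watts, old_hashrate):.4f} per MH/s per day")
print(f"next gen:    ${cost_per_mh_day(new_watts, new_hashrate):.4f} per MH/s per day")  # two-thirds the cost
```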
 