RTX 4xxx / RX 7xxx speculation

Nebell

Continuing on the thread from a couple of years ago.
I was pretty close with my guess: the 3080 Ti is exactly 45% faster than the 2080 Ti at 4K.

But I did not expect AMD to be this competitive. The 6900 XT was slower than the RTX 3090, but the new 6900 XTXH is faster.

So if we ignore the refreshes coming early 2022, what do you think is going to happen with the next gen?

It's going to be harder to guess now.
Time for both AMD and Nvidia to start pulling rabbits out of the hat while Intel is chipping away at the low-to-mid-end GPU market. Nvidia is also going from 8nm Samsung to 5nm TSMC.
But I think we can expect even higher gains. Somewhere in the range of 60-80% faster for RTX 4080Ti compared to 3080Ti (at 4k resolution).
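Rough arithmetic on how those guesses compound, just as a sketch (the 45% figure and the 60-80% range are the ones above; the rest is plain multiplication):

Code:
# compounding the observed and guessed 4K uplifts from the post above
uplift_3080ti_vs_2080ti = 0.45          # observed this gen
for guess in (0.60, 0.80):              # guessed range for 4080 Ti vs 3080 Ti
    total = (1 + uplift_3080ti_vs_2080ti) * (1 + guess) - 1
    print(f"+{guess:.0%} next gen -> ~{total:.0%} faster than a 2080 Ti")
# +60% next gen -> ~132% faster than a 2080 Ti
# +80% next gen -> ~161% faster than a 2080 Ti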
 
I'm sure they're both going to be ray tracing beasts and I think they'll be roughly the same in terms of power (trading blows here and there).

I thought both teams had decent reference coolers (AMD especially improved their coolers from previous generations), so I'm expecting that to continue.

I've got high hopes for Intel, I'm hoping they'll shake up the lower end enough so that Nvidia and AMD will start taking that category seriously again.
 
Rumors are that the 40xx series cards from Nvidia will be 4x faster at RT & consume much more power than the 30xx series cards. Nvidia has been notoriously cursed with "4", everyone remembers the GTX 480.

But none of that matters, you won't be able to buy one.
 
I predict that nvidia will keep producing LHR cards and AMD will go back to HBM2 memory and we'll see a lot of overlap between their next gen HPC cards and gaming cards (I hope).
 
The latest rumors are twice the performance for twice the power on RTX 4xxx.
I find that hard to believe. Both claims.
Yes, competition is what drives innovation, but from a business perspective (and these are private business companies), it makes no sense.
Even if AMD and Nvidia were in a fierce fight, one would hardly believe they would go for twice the performance. If one sets 50% increased performance, the other will try to beat that by 5%. And then they start with refreshes to milk more money :)
I mean, look at the current gen. Why is it only some 45% faster than the previous gen? Why not 100%, when there's certainly room for that kind of jump? The companies are not innovating; the tech is already there, they are just delaying releases.

And going from 8nm Samsung (which is inferior to TSMC) to 5nm TSMC and then drawing twice as much power? WTF is that rumor? It can't be a gaming card.
 
The latest rumors are twice the performance for twice the power on RTX 4xxx.
I find that hard to believe. Both claims.
Yes, competition is what drives innovation, but from a business perspective (and these are private business companies), it makes no sense.
Even if AMD and Nvidia were in a fierce fight, one would hardly believe they would go for twice the performance. If one sets 50% increased performance, the other will try to beat that by 5%. And then they start with refreshes to milk more money :)
I mean, look at the current gen. Why is it only some 45% faster than the previous gen? Why not 100%, when there's certainly room for that kind of jump? The companies are not innovating; the tech is already there, they are just delaying releases.

And going from 8nm Samsung (which is inferior to TSMC) to 5nm TSMC and then drawing twice as much power? WTF is that rumor? It can't be a gaming card.

I think they both are giving us everything they have at each release. Nvidia was hoping to curb stomp AMD but couldn't, except in RT, and the 3090 is a massive die with massive power consumption. I think they were hurt by a lack of TSMC capacity, which allowed AMD to almost catch up, as they have the process advantage. That looks to be going away next gen, but they will have had time to catch up in RT performance and figure out MCM - which could be what lets them pass Nvidia, at least in raster performance. If they can get close with RT they may actually have the advantage next gen. I'm not sure what you think Nvidia is holding back; if they could make a smaller and more efficient die, it'd be in their best interest to do so, as they could get a lot more chips per wafer and make a ton more money.
 
I'm not sure what you think Nvidia is holding back, if they could make a smaller and more efficient die, it'd be in their best interest to do so as they could get a lot more chips per wafer and make a ton more money.

They are holding back. I have a pretty good feeling about that.
They nailed the 3080 and 3090 performance. No, they are not 20% faster than the 6800 XT and 6900 XT, just marginally faster. And surprise, their 3080/3090 announcement was timed well also.
If AMD cards were, say, 15% / 30% (6800 XT / 6900 XT) faster than the 2080 Ti, then I bet you'd see the 3080/3090 at 20% / 35%, and not the 35% / 50% it is now (at 4K).
Nvidia simply did what they needed to do to edge AMD. And I'm sure they are ready for the next gen as well, with insane prices in mind. Because idiots like me will pay €1400 for RTX 3080.
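To put numbers on that "just edge them out" idea, a quick sketch (the percentages are the hypothetical ones from this post, nothing more):

Code:
# if AMD were 15%/30% faster than a 2080 Ti and Nvidia answered with 20%/35%,
# the head-to-head lead would be tiny
scenarios = [("3080 vs 6800 XT", 0.20, 0.15), ("3090 vs 6900 XT", 0.35, 0.30)]
for name, nv, amd in scenarios:
    print(f"{name}: {(1 + nv) / (1 + amd) - 1:.1%} lead")
# 3080 vs 6800 XT: 4.3% lead
# 3090 vs 6900 XT: 3.8% lead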
 
They always hold back, they have probably two or three generations of graphics technology in the pipe but they release them in increments.
 
Rumors are that the 40xx series cards from Nvidia will be 4x faster at RT & consume much more power than the 30xx series cards. Nvidia has been notoriously cursed with "4", everyone remembers the GTX 480.

But none of that matters, you won't be able to buy one.
Only if you're too young. The 4600/4400/4200 cards from ~2002 are legendary. But yeah, you might not be able to buy NV's next "4" card.
 
Doubt we see HBM on any consumer graphics card ever again, unless the price gets close to GDDR. AMD's Infinity Cache has done a pretty good job; if they can double it on the next-gen cards, they should be good with GDDR6.
 
Doubt we see HBM on any consumer graphics card ever again, unless the price gets close to GDDR. AMD's Infinity Cache has done a pretty good job; if they can double it on the next-gen cards, they should be good with GDDR6.
HBM costs are coming down (well, maybe not lately). A single HBM3 stack can push over 800 GB/s. It comes down to the cost of a board supporting GDDR6 at whatever bus width versus the cost of an interposer plus HBM, and I do believe the next generation of AMD GPUs will have an interposer anyway. So what is cheaper: a complicated board using GDDR6(X) with a wide bus, or faster HBM with a few stacks, lower power, and easier cooling? I would not rule it out for the upper-end consumer cards.
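Very rough peak-bandwidth math behind that trade-off (the per-pin rates below are ballpark figures, not tied to any particular card):

Code:
# rough peak-bandwidth comparison, GDDR6 vs HBM; per-pin rates are ballpark assumptions
def bandwidth_gbs(bus_width_bits, gbps_per_pin):
    return bus_width_bits * gbps_per_pin / 8  # GB/s

configs = {
    "GDDR6, 256-bit @ 16 Gbps":             (256, 16.0),
    "GDDR6X, 384-bit @ 19 Gbps":            (384, 19.0),
    "1x HBM2e stack (1024-bit @ 3.2 Gbps)": (1024, 3.2),
    "1x HBM3 stack (1024-bit @ 6.4 Gbps)":  (1024, 6.4),
}
for name, (width, rate) in configs.items():
    print(f"{name}: ~{bandwidth_gbs(width, rate):.0f} GB/s")
# GDDR6, 256-bit @ 16 Gbps: ~512 GB/s
# GDDR6X, 384-bit @ 19 Gbps: ~912 GB/s
# 1x HBM2e stack (1024-bit @ 3.2 Gbps): ~410 GB/s
# 1x HBM3 stack (1024-bit @ 6.4 Gbps): ~819 GB/s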
 
You could be right. And if card prices stay as high as they are now, it becomes that much more possible that HBM could make sense.
 
I predict that nvidia will keep producing LHR cards and AMD will go back to HBM2 memory and we'll see a lot of overlap between their next gen HPC cards and gaming cards (I hope).
AMD has moved completely away from HBM or any expensive memory (GDDR6X?) for consumer GPUs, thanks to their Infinity Cache and chiplet tech (coming soon for GPUs).
 
Only if you're too young. The 4600/4400/4200 cards from ~2002 are legendary. But yeah, you might not be able to buy NV's next "4" card.
Not sure if the 4000 series was "legendary", but it was definitely venerable. My 4400 handled Jedi Academy really well, if I recall. And to be fair, the 480 performed admirably for the time, as long as you had the PSU and cooling to handle it.

As for the new 4000-series, it's like discussing which Star Trek starship is best -- totally academic and imaginary. They're going to be priced through the roof from the start, and even then you won't get one.
 

It's more out of necessity; you have to plan a lot in advance and hope that fabrication is where it needs to be for what you have designed. As a result, if the competition has something amazing, they could in theory skip a design to close the gap, if those chips can be made. This might also result in very bad availability due to low yields, even in non-pandemic times.
 
Was the 3090 a $1,499 MSRP? The 4090 will MSRP at least double that.

So yeah...my speculation, hard to get and super expensive.
 
Yeah, for the minor bump in camera spec, rounded edges, and all-glass design, the word "sheeple" exists for such people. I'm happy with my S20 Ultra; I went from a Nexus 6 I held onto until it died a very slow death. I'll wait till there is an RDNA 2 GPU in the next phone before upgrading.
 
If Nvidia switches back to TSMC, I'm expecting a good lead against AMD. Ampere is still faster at 4K despite them cheaping out with Samsung. At node parity, it's an easy curb stomp.
 
It's more out of necessity; you have to plan a lot in advance and hope that fabrication is where it needs to be for what you have designed. As a result, if the competition has something amazing, they could in theory skip a design to close the gap, if those chips can be made. This might also result in very bad availability due to low yields, even in non-pandemic times.
It is not holding back if it cannot realistically be mass-manufactured at a viable price (and power consumption).

If the claim is that they could have made something that costs even more and consumes even more (and cuts their margin a lot, or goes for a much higher price), that is not holding back; that is choosing a price and thermal envelope on a process they can actually deliver. Having theoretical, better future designs in the works is really different from holding back.

A console is an extreme example of that. Obviously Sony and Microsoft could have made a better-performing console, and they watch each other and don't overshoot, but they are held back by the thermal/price/size/reliability envelope and the ability to deliver tens of millions of the final design. Under that envelope they do the best they can, and I doubt they would accept AMD keeping a better system (one it could have made) in reserve as a plan B in case of an Intel or Nvidia surprise release. Competition and third-party strength both seem way too big for that.
 
If Nvidia switches back to TSMC, I'm expecting a good lead against AMD. Ampere is still faster at 4K despite them cheaping out with Samsung. At node parity, it's an easy curb stomp.
Can you imagine how much worse the card supply situation would be if Nvidia hadn't gone with Samsung? I imagine that the decision to go with Samsung was made long before the covid/crypto situation, but it's good they did.
 
If Nvidia switches back to TSMC, I'm expecting a good lead against AMD. Ampere is still faster at 4K despite them cheaping out with Samsung. At node parity, it's an easy curb stomp.
Interesting. I think AMD will completely destroy Nvidia next generation. I think they will be able to position their cards against Nvidia at any place they want. I see nothing from Nvidia that will compete with a multi-chiplet graphics card from AMD. Seems like they can easily do 2x 6900 XT performance or more. Nvidia is just not going to see that bump. If the 2x speed / 2x power rumor is true, I think a 500+ W GPU is a really hard sell. Like close to impossible. I do expect that the chip shortage will have created new baseline prices for GPUs. Expect $1,500+ list on the 4080/7800-level SKUs next gen.
 
I do expect that the chip shortage will have created new baseline prices for GPUs. Expect $1,500+ list on the 4080/7800-level SKUs next gen.

Not so sure about that; maybe if shortages persist. But even then, around my part of the world, there are lots of GPUs in stock in stores for silly prices like 2x MSRP, and they have kept being in stock for the last 6+ months. While they may be selling some, they do not sell out in seconds anymore like they did at launch.

There are only so many people who are willing to pay stupid prices for hardware, and I'm not so sure they are going to keep doing so. Maybe in a couple of generations, when their hardware can't keep up anymore, but next gen? I doubt it will be as silly as this gen.
 
If there is another wave of free COVID money like there was here (homeless guys got $11,500 checks), then I can see it being a problem. When you get that kind of money, you can pay whatever anyone asks, because it's free.
 
Interesting. I think AMD will completely destroy Nvidia next generation. I think they will be able to position their cards against Nvidia at any place they want. I see nothing from Nvidia that will compete with a multi-chiplet graphics card from AMD. Seems like they can easily do 2x 6900 XT performance or more. Nvidia is just not going to see that bump. If the 2x speed / 2x power rumor is true, I think a 500+ W GPU is a really hard sell. Like close to impossible. I do expect that the chip shortage will have created new baseline prices for GPUs. Expect $1,500+ list on the 4080/7800-level SKUs next gen.
I tend to agree with you on AMD likely having the top card in the next batch. 500W+ also seems likely. The catch is it'll probably be multiple chiplets, so more like a ThreadRipper or EPYC than a desktop Ryzen with up to two chiplets. Bye bye CrossFire, hello chiplets and mega GPUs. The rumors I've seen so far are for a 2-chiplet card, but I bet they're just starting with two. Really high end stuff doesn't sell in big numbers. First priority is probably to beat NV in the midrange where most of the money is and knock off the 3090. They'll probably also keep producing monolithic designs for lower end stuff. Chiplets are more expensive than a monolithic design that's small enough. 2 dies instead of one and higher packaging costs. If the monolithic design is large enough the improved yields from multiple smaller dies makes chiplets worthwhile, but if it isn't the yield gap isn't big enough to cover the extra cost of multiple chiplets and more complicated packaging.
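A toy yield model for that monolithic-vs-chiplet point (defect density, die sizes, and wafer area below are made-up illustrative numbers, and packaging losses are ignored):

Code:
import math

# toy Poisson yield model; every number here is an illustrative assumption
defects_per_cm2 = 0.1
wafer_usable_cm2 = 600          # rough usable area of a 300 mm wafer

def die_yield(area_cm2):
    return math.exp(-defects_per_cm2 * area_cm2)

# one 600 mm^2 monolithic GPU vs. two 300 mm^2 chiplets per GPU
mono_gpus     = (wafer_usable_cm2 / 6.0) * die_yield(6.0)   # good monolithic dies per wafer
good_chiplets = (wafer_usable_cm2 / 3.0) * die_yield(3.0)   # good chiplets per wafer
chiplet_gpus  = good_chiplets / 2                           # any two good chiplets make a GPU

print(f"monolithic GPUs per wafer: {mono_gpus:.0f}")    # ~55
print(f"chiplet GPUs per wafer:    {chiplet_gpus:.0f}") # ~74
# smaller dies waste less silicon per defect, which is the yield advantage
# described above, offset by the extra packaging cost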

At any rate, what I'm expecting is some expensive monster cards from AMD, and NV doing a usual generational improvement and building some power guzzlers in an attempt to keep up. AMD will build power guzzlers too, but at least they'll be really fast. So it'll be kind of like what Intel and AMD have done when the other one gets ahead. Intel's been juicing the power the last few generations, and remember the AMD FX-9590 @ 220 W TDP? The same sort of thing also happened with Netburst (P4 era) vs. Opteron/Athlon 64.
 
How do you cool a 500 W card, especially in the summer? Coolers are already triple-fan, 2.75-slot-wide monsters. It's not like everyone can accommodate AIO solutions either, especially when so many are using that space for a CPU AIO. You would need a 360 mm AIO to properly cool a 500 W TDP GPU. I don't think it will be the norm at all, but we might see it for some AIB cards.
 
How do you cool a 500 W card, especially in the summer? Coolers are already triple-fan, 2.75-slot-wide monsters. It's not like everyone can accommodate AIO solutions either, especially when so many are using that space for a CPU AIO. You would need a 360 mm AIO to properly cool a 500 W TDP GPU. I don't think it will be the norm at all, but we might see it for some AIB cards.
STH has a good video on this. AMD already has a 2 chiplet GPU for HPC uses that consumes a ton of power and Nvidia has similar 500-600W GPUs. I think you'd need water-cooling for it, especially in a desktop case where you can't run fans at super fast speeds due to the obscene noise.
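Quick back-of-the-envelope on why 500 W is such a pain to cool in a case (the 15 °C intake-to-exhaust rise is an assumed figure; the air properties are standard values):

Code:
# rough airflow needed to move 500 W of heat out of a case
power_w  = 500.0
delta_t  = 15.0        # K, assumed temperature rise of the exhaust air
cp_air   = 1005.0      # J/(kg*K), specific heat of air
rho_air  = 1.2         # kg/m^3, air density

mass_flow   = power_w / (cp_air * delta_t)      # kg/s of air
volume_flow = mass_flow / rho_air               # m^3/s
print(f"~{volume_flow * 2118.88:.0f} CFM just for the GPU's 500 W")   # ~59 CFM
# and that's on top of CPU and VRM heat, hence the huge coolers and the AIO talk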
 
Rumors are that the 40xx series cards from Nvidia will be 4x faster at RT & consume much more power than the 30xx series cards. Nvidia has been notoriously cursed with "4", everyone remembers the GTX 480.

But none of that matters, you won't be able to buy one.

It's a good clue that clock speeds will be a lot higher if power consumption is going to be that high. Nvidia will be getting some of their gains through frequency increases rather than IPC. I would expect the next refresh on TSMC to use less power as they get more experience with these new TSMC nodes, but the first one will be gulping power.
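A rough sketch of why clock pushes eat a node shrink's savings: dynamic power scales roughly with frequency times voltage squared. Every number below is an illustrative assumption, not a leak or a spec:

Code:
# dynamic power scales roughly with frequency * voltage^2; all numbers are assumptions
base_power  = 350.0    # W, something like a current flagship
node_saving = 0.70     # assume the new node cuts power ~30% at the same clocks/voltage
freq_gain   = 1.25     # +25% clocks
volt_gain   = 1.10     # +10% voltage assumed to reach those clocks

new_power = base_power * node_saving * freq_gain * volt_gain ** 2
print(f"~{new_power:.0f} W")   # ~371 W: the shrink's savings get eaten by clock/voltage pushes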
 
I don't see much changing, really, other than AMD optimizing more than Nvidia on the open-source side, more BTUs to manage, and, oh, a 4K fps increase of near 30%. Hopefully they won't catch fire, lol.
 
Well, it can't be like the Nvidia fire monsters: first Fermi, then early 2000-series models melting 8-pins. Never heard of AMD's HBM cards like the Nanos catching fire, only coil whine and leaky pumps.
 
I don't see how a desktop GPU (aka 4090) can be $3,000. That is Titan level "prosumer" product. $1,999? That seems more likely IMO.
 
I don't see how a desktop GPU (aka 4090) can be $3,000. That is Titan level "prosumer" product. $1,999? That seems more likely IMO.
People are paying close to that for 3090s. What makes you think a substantial number of people/suckers won't pay north of $3k for a supposed big jump in performance? I'm clearly not one of those, but don't kid yourself into thinking they won't fly off the shelves at over $3k.
 
Probably next year. Winter is usually the time for the new series right?
 
next year late fall/winter.

~15% raster improvement from nVidia; they'll again tout their DLSS and more ray tracing (prob 30% gain in RT) to make up for the abysmal gains.

AMD may edge out a bit more; their architecture atm seems more flexible and competitive (if a little immature) compared to nVidia.
 
They are holding back. I have a pretty good feeling about that.
They nailed the 3080 and 3090 performance. No, they are not 20% faster than the 6800 XT and 6900 XT, just marginally faster. And surprise, their 3080/3090 announcement was timed well also.
If AMD cards were, say, 15% / 30% (6800 XT / 6900 XT) faster than the 2080 Ti, then I bet you'd see the 3080/3090 at 20% / 35%, and not the 35% / 50% it is now (at 4K).
Nvidia simply did what they needed to do to edge AMD. And I'm sure they are ready for the next gen as well, with insane prices in mind. Because idiots like me will pay €1400 for RTX 3080.
lol, according to any reputable source, they (nVidia) panicked and duct-taped together a solution (with insane power draw and thermals) to continue to compete with AMD. nVidia had nothing left in the tank for the 3000 series.

I don't expect AMD to rout nVidia, but I do expect more from AMD than nVidia at this point.
 
Interesting. I think AMD will completely destroy Nvidia next generation. I think they will be able to position their cards against Nvidia at any place they want. I see nothing from Nvidia that will compete with a multi-chiplet graphics card from AMD. Seems like they can easily do 2x 6900 XT performance or more. Nvidia is just not going to see that bump. If the 2x speed / 2x power rumor is true, I think a 500+ W GPU is a really hard sell. Like close to impossible. I do expect that the chip shortage will have created new baseline prices for GPUs. Expect $1,500+ list on the 4080/7800-level SKUs next gen.
I think all companies (Intel, Nvidia, Sony, etc.) are working on chiplets, not just AMD. I could be naïve, but it sounds extremely challenging and not easy to me (I imagine it would already have been done if that were not the case, given how much improvement at the same size/watt has slowed down and the complexity that has been reached).

https://research.nvidia.com/sites/default/files/publications/ISCA_2017_MCMGPU.pdf
Nvidia MCM-GPU: Multi-Chip-Module GPUs for Continued Performance Scalability

https://www.techspot.com/article/1975-intel-xe-preview-v2/

~15% raster improvement from nVidia; they'll again tout their DLSS and more ray tracing (prob 30% gain in RT) to make up for the abysmal gains.

Would that not be low? It seems that going to TSMC 5nm, if they decide to keep the cards really power hungry, would almost mean zero improvement...

Last time it was often in the 30-40% range, though it's not obvious, with all the SKUs going around, what to compare from generation to generation.

[chart: Average 4K fps, current vs. previous generation]

the "4k card":
2080ti->3080ti: +40%
2080 -> 3080: +70%

the "1440p card""
Average_1440p-p.webp


2070 super->3070: +33%
2070->3070: +58%
https://www.techspot.com/review/2155-geforce-rtx-3060-ti/
2060 super -> 3060ti: +50%
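For what it's worth, those uplift figures are just ratios of the average fps from the charts; the fps numbers below are placeholders to show the arithmetic, not the actual chart values:

Code:
# uplift = new_avg_fps / old_avg_fps - 1; fps values here are made up
def uplift(new_fps, old_fps):
    return f"+{new_fps / old_fps - 1:.0%}"

print(uplift(98, 70))    # +40%
print(uplift(119, 70))   # +70%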
 