RTX 5xxx / RX 8xxx speculation

Not every card is going to cost over $1,000 though. The 5080 might get its price raised back up to $1,200 to match the 4080's initial launch price, but everything under it is going to be less than $1,000. If Nvidia starts resting on its laurels, that only gives AMD an opportunity to take the performance crown, and I'm sure Mr. Leather Jacket would never allow that to happen.

I feel a 5080 priced over $1,000 will be a dust collector for retailers, sort of like the current 4080. Honestly I was a bit surprised how well the 4090 sold, so that one may still do well at a higher price. Profit is far more important to the leather jacket right now; otherwise the stock value might drop.
 
A $1,199 5080 that beats the 4090 by 15% could still sell if the 5090 is $2,000 with 24 GB of VRAM... especially if it has over 1,600 TFLOPS at FP8 / 3,000 peak INT TOPS.

There is no 4080 Super at MSRP on Newegg outside the PNY and Zotac brands, I think.

Everything at msrp is sold out at Best Buy:
https://www.bestbuy.com/site/searchpage.jsp?id=pcat17071&qp=gpusv_facet=Graphics Processing Unit (GPU)~NVIDIA GeForce RTX 4080 SUPER&sp=+currentprice skuidsaas&st=4080+super
I'd be shocked if it were more than 16 GB. It'll sell out regardless, and if it doesn't, Nvidia is more than happy to sit on them until they sell. I don't expect them to lower prices until demand drops in the AI sector.
 
The 4080 Supers definitely sold, but the regular 4080s never really sold well. I just think the upcoming 5000 series will likely lack the performance to really push anyone to upgrade, except the few who have to have the best, especially at $1,000-plus. Amazon shows stock for the 4080 Supers from different brands. We'll see though; we're all just speculating based on the little information out there.
 
You do know that the 980 Ti and 780 Ti are on the same process node (28nm), yet the 980 Ti delivered pretty sizeable gains, right?
From the GTC 2024 keynote (at ~57 minutes), he shows the inference performance increase of Blackwell over Hopper/Ada; it's 30x faster:

I believe Blackwell is two GPUs combined into one unit. Plus, he attributes a portion of this increase to the crazy-fast interconnect.

So if we suppose the 30x increase is a best-case scenario or cherry-picked workload, let's say a more realistic increase is 24x. To compare as a single die, let's cut that in half to 12x. If it's getting a 50% boost from the interconnect, let's drop it to 8x. That's still huge. And this is against Hopper, the datacenter part that parallels the Ada gaming GPUs.
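The discounting above works out as quick napkin math (every factor here is this post's guess, not a measured number):

```python
# Napkin math on the 30x inference claim; all discount factors are guesses.
claimed = 30.0
realistic = claimed * 0.8        # assume ~20% of the claim is cherry-picking: 24x
per_die = realistic / 2          # Blackwell is two dies in one package: 12x
no_interconnect = per_die / 1.5  # strip an assumed 50% interconnect boost: 8x
print(no_interconnect)           # 8.0
```

Even after all three haircuts, an 8x uplift on a single die would be enormous.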

No way to know how this will translate to gaming performance improvements, since it is the same TSMC 4N process. Hell, if the performance of the gaming GPUs is double the 4xxx, that will be nuts.

Historic performance increases over the previous gen, and the process nodes they were built on:
780 Ti to 980 Ti: 41% (both on the same 28nm node, as pointed out; pretty damn impressive)
980 Ti to 1080 Ti: 75% (16nm)
1080 Ti to 2080 Ti: 41% (12nm)
2080 Ti to 3090 Ti: 56% (Samsung 8nm, derived from their 10nm process)
3090 to 4090: 35% (TSMC 4N)
3090 Ti to 4090: 27% (TSMC 4N)
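For fun, compounding those gen-over-gen figures (using the 3090 Ti to 4090 step for the last hop) gives a rough cumulative uplift; the numbers are just the ones listed above:

```python
# Compound the per-generation uplifts listed above, 780 Ti through 4090.
gains = [0.41, 0.75, 0.41, 0.56, 0.27]  # 980Ti, 1080Ti, 2080Ti, 3090Ti, 4090 steps
total = 1.0
for g in gains:
    total *= 1 + g
print(f"{total:.1f}x")  # roughly 6.9x overall
```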

So here's to being optimistic that the increase will exceed the historic increases.
 
it's 30x faster:
This is comparing FP4 operations per second against FP8 for the other gen, I think (the other chips like the H200 are almost multi-GPU already), and a vast part of the boost is from the complete ecosystem, like you say.

It is not a very indicative value because it compares different operations; obviously, hardware specialized for 4-bit will run them much faster than 8-bit operations.


https://www.semianalysis.com/p/nvidia-blackwell-perf-tco-analysis

At 32/16/8 bits it is more like:
H100 -> B100: +77%, which is more representative than +3000%. Now, would it have been higher if they had not put a lot of hardware into 4-bit? I could not tell you.
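To illustrate the apples-to-oranges problem with invented numbers (chosen only to land near that +77% figure): halving precision roughly doubles throughput on its own, so a naive FP4-vs-FP8 comparison inflates the real uplift.

```python
# Invented throughput numbers showing how mixed-precision comparisons mislead.
old_fp8_tops = 4000      # hypothetical older chip, measured at FP8
new_fp4_tops = 14160     # hypothetical newer chip, measured at FP4

naive = new_fp4_tops / old_fp8_tops      # 3.54x, but it mixes precisions
iso = (new_fp4_tops / 2) / old_fp8_tops  # 1.77x at matched precision
print(f"naive {naive:.2f}x, iso-precision {iso:.2f}x")
```

The iso-precision number is the one that says something about the hardware itself.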
 
I figured it was cherry-picked in some way; with all presentations and PR material, you gotta expect that.

77% is much more believable. Still an amazing uplift as well. Sounds like it could match the 9xx to 10xx generational performance increases.
 
77% is much more believable. Still an amazing uplift as well.
H100 was 2.5-3 times the A100 on those metrics, for a recent reference. We will have to see, but if the B100 is a bigger chip using more power... it suggests the stagnant node could mean a low raw upgrade, that Blackwell could be a very short-lived stopgap, and that the next gen could come in 2025 as rumoured.
 
We can hope that by the 6xxx generation, they will be on a 3nm process.
 
I said 77% of their graphics business, not overall business. That accounts for $10 billion of that $13.5 billion in your chart.

That is a ton smaller than the $47.5 billion they made on other products, like AI chips. What would you do when facing a capacity-strained TSMC and needing to pick a priority on what to produce?
 
Blackwell will be faster (duh), but the top card 5090 will have less VRAM. With the VRAM shortage and the premium pricing Nvidia can get for professional cards, gaming cards will get less VRAM. The only ones looking for lots of VRAM on consumer cards are part-time ML/AI researchers or small startups trying to cut costs by not buying the similar professional-level cards.
Remember the whole crypto debacle, where the newer the card, the less efficient it was at mining.
 
That is a ton smaller then the 47.5 billion they made on other products, like AI chips. What would you do when you are facing a capacity strained TSMC and need to pick a priority on what to produce?
There are two main options. They can do like they did with Ampere: Samsung 8nm for the gaming products, TSMC 7nm for the commercial products.

Or they can do what all the rumors seem to point to: launch the AI chips before the gaming products, with 100% of the allocation going to AI chips for some months.

A mix of both could become common, as I imagine it is not purely the TSMC part but also the substrate and other elements of the supply chain: some high-end chips on a different node than the rest, with the H100 type getting the very best, and the L40 and the gaming parts left on a different node.

The rumours for 2025 would be an extreme example of that: the rumors say Blackwell's replacement could come as early as 2025, on NVIDIA's edition of TSMC 3N (or better), so the AI cards could have a full year alone on the latest node before the 2026 gaming-card generation.
 
The 5090 is gonna cost $3k minimum. My guess. And if that's true... I will hold onto my 4080 Super for a lonnnngggg time.

$1k for a video card is enough for me. I don't game at that level anymore. I'd rather be a socially active, outdoors person as I near 50. I do not want to spend three THOUSAND dollars on a video card that keeps me away from society.

I guess I am just not into gaming that much anymore at those prices. Have fun, whoever spends north of $2k for a simple video card. I'm not.
 
Same here, I just cannot get into PC gaming with top-end video cards costing $2k; that used to be what the whole PC cost.

Good thing consoles are still reasonable.
 
You can always downgrade monitor resolution. I think the 4070 Ti/Super are great alternatives for 1440p. I play on a 4090 laptop at 4K/120Hz.
Of course not all games get to those 120 fps, but I'm pretty happy; DLSS is available in many games and I have nothing against that tech (or FG). My eyes only see the fps increase.
 
You do not have to buy the top-end card to have a good experience on PC. My nephew bought a 7900 XT for $650 and it gives him an experience far superior to consoles, while the only major loss vs a top-end 4090 is RT performance, which he doesn't care about. You used to be able to argue for DLSS on Nvidia, but now that XeSS 1.3 is out that argument has become less valid.
 
AMD, eww... that card in 3 years won't have any compatible drivers.
 

Both predict that Windows 12 will exist and be different enough that Win11 drivers won't work, and that AMD will have stopped making dGPUs and drivers within 3 years, even for its latest GPUs.

We'll see how that turns out...
 
I'd actually argue that if you're a "PC guy" getting the top-end parts, your experience is probably pretty terrible. Nobody who builds their own PC and also spends tons of money on top-end parts is simply plopping them in, installing drivers, and playing games happily. We tinker, test, hack, tweak, adjust... and the level of performance is never enough!

If you get mid-range 'bang for buck' parts, you're more likely to set-and-forget. 4K 240FPS ULTRA is not what you expect, and you don't lose sleep if you can't hit it. No sense in tinkering or delving into minute settings.

That's completely different for trust fund kids who just drop $7000 on a pre-built PC to play Helldivers with their bros when the weather isn't good enough to take their Lamborghini out. They probably have a swell time, because they don't really care enough to tinker.
 
I only tinker the first day or two; after that I never bother with it again. Seeing so many people talk about GPUs being too expensive so PC gaming is going to die or something like that is just absolute nonsense though. Nobody needs a 4090 to enjoy PC gaming; it ain't like anything below a 4090 is suddenly in unplayable territory. You can just spend $400-$600 on a GPU and get a PC gaming experience that beats consoles.
 
I'm waiting for the 7900 GTXs to hit the $700 mark. That's when I'll pull the trigger.
Maybe they will when the 50 series from NVIDIA comes out. I just need good raster performance at 4K with max settings for games like Cyberpunk. I'm mostly playing indie games these days anyway, since AAA studios just can't do it right.
 
Then you want Nvidia ;). Ray tracing is huge in that game. DLSS helps it run with virtually no IQ loss too, unlike FSR.

They did around 20ish years ago... or a bit less. :D
I'm fine without ray tracing

Nvidia is trying to set a new normal where you pay a thousand or more for a video card, and I am not down with that.

My bad, 7900xtx
 
I'm fine without ray tracing
It is possible that max settings will more and more just enable RT without it being a separate option (think Avatar, or some Unreal 5 games that do not even have pre-baked lighting to use), and it is already a bit strange to call RT-off "playing at max settings" in a game like Cyberpunk.
 
I'm a little confused. Are you saying that you're not really playing at max settings if you're not using ray tracing? I guess I could see that point of view, having never used it myself. I'm just not much of a gamer these days in terms of needing to be on the cutting edge of advanced visuals. I'm mostly a casual, infrequent gamer who likes older and more modest indie games and A and AA titles.

But I have been wanting to do a proper playthrough of Cyberpunk now that they've finally finished and polished it up, and it would be nice to play at high settings at 4K native resolution on my monitor with good, responsive frames per second. Most of the other demanding AAA titles these days really seem like trash, though. I think Metro Exodus is probably the only other graphically demanding title I'm interested in. If Star Citizen ever gets finished I'd buy that too, but I don't buy betas. Maybe the latest MS Flight Sim for some casual sightseeing.
 
I'm a little confused. Are you saying that you're not really playing at max settings if you're not using ray tracing?
In some games the question does not really apply. Avatar, for example, uses RT when you set the game to ultra settings; it is not a separate setting. That is why at max settings the 4070 Ti beats a card like the 7900 XTX. It could be because it is an AMD-sponsored title and AMD asked them to hide the RT in the menus and marketing, or it could be a sign of what is to come:

[Attached benchmark chart: relative performance at 1920x1080]


Older cards use a software RT mode and look different as a result. There is no "max settings but no RT" type of option.

Some other games do not have an option to turn RT off at all, at any setting; if you use some tricks to disable RT, it looks like this:

https://www.youtube.com/watch?v=6PebQc3WFWk

As RT is, in general, a graphical setting, if it is not set to the max then you are obviously not, in some sense, playing at max settings.

If:
I'm just not much of a gamer these days in terms of needing to be On the cutting edge of advanced visuals.
Trying to play relatively recent titles like Cyberpunk at 4K max settings is maybe not the way to describe your needs. Things move quite slowly these days; getting, say, 70 fps in Cyberpunk at maximum settings at 4K is quite hard even with RT off, and you probably need a 4090 for that:

[Attached benchmark chart: relative performance at 3840x2160]


Upscaling has gotten quite good and could be something to try for those really hard-to-run games. A bit like RT, going from high to max/ultra can carry quite the performance cost for little visual gain.
 