GeForce RTX 5000 GPUs Rumored For A 2024 Release And Huge Performance Uplift

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
10,929
Nice

“So with all that in mind, what does it mean for the GeForce RTX 50 series? It's too early to know for sure, but according to YouTuber RedGamingTech, NVIDIA's next-gen lineup will debut next year and deliver a "HUGE" performance leap.

There's not a lot of meat to the rumor and a large portion of the 13-and-a-half-minute video is devoted to rambling about things that are not directly related to the GeForce RTX 50 series. But there are a few nuggets to digest. One of them is NVIDIA's reference to "Hopper Next" during Huang's keynote at Computex.

According to RedGamingTech, Hopper Next is "basically Blackwell" and is "going to be bifurcated, which is not too surprising, we've seen it before of course from NVIDIA across different product segments." That will purportedly include GeForce RTX 50 parts for desktops and laptops, as well as high performance computing (HPC) SKUs.

There's also a mention of SK hynix confirming that its HBM3E memory is on track to show up in the first half of 2024. However, we'd be surprised if NVIDIA employed HBM3E on its consumer GPUs.

Along the same topic, there's no mention of what the memory allotment will look like for future GPUs. NVIDIA has drawn some criticism for not outfitting its cards with more VRAM, which prompted AMD to go on the offensive. But that, along with everything else about the next-gen lineup, is a discussion for another day.”

Source: https://hothardware.com/news/geforce-rtx-5000-gpus-2024-release-2x-performance-uplift
 
I care less about the "performance leap" than the price for that performance.
What if that performance leap puts the price for performance of the RTX 5000 series in a class of its own? Like the performance margins are so high that the price per performance is profoundly diminished this time around?
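The ratio itself is easy to sketch: price per unit of performance only improves when the performance gain outpaces the price increase, even though the absolute outlay still grows. A quick back-of-napkin check (the prices below are hypothetical, purely for illustration):

```python
def price_per_perf_change(old_price, new_price, perf_gain):
    """Relative change in price per unit of performance between generations.

    perf_gain is fractional: 1.0 means +100% performance.
    Negative result = better value per dollar; positive = worse.
    """
    old_ppp = old_price / 1.0               # baseline: last gen = 1.0x performance
    new_ppp = new_price / (1.0 + perf_gain)
    return (new_ppp - old_ppp) / old_ppp

# Hypothetical: +100% performance at +70% price still cuts price/perf by 15%,
# even though the card itself costs $700 more up front.
print(f"{price_per_perf_change(1000, 1700, 1.0):+.0%}")  # -15%
```

Which is the rub: the ratio can improve every generation while the entry price keeps climbing out of reach.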
 
I just got done playing Spider-Man on my PS4 Pro for the first time, one of those "holy shit, I own this!?" moments. Had the best 2 weeks of my life gaming, still amazed the PS4 Pro pulls off those visuals... smooth gameplay, fluid combat, great graphics at faux-4K checkerboard upscaling, just overall impressed across the board. $250 console and a $9.99 game.

What games are you guys dropping $2000 on a GPU to play again? FireStrike?

The return you get on these $1200+ GPUs is just really sad at this point... I really think PC gaming has lost its way with these marginal gains at outrageous prices... Cyberpunk 2077 path tracing is pretty, but that's not a $1500 investment to turn on the "even prettier lights" option... to me anyhow, YMMV.
 
Probably will end up being roughly two years after the RTX 4000s, which are already 8 months old. Not really news, that seems to be Nvidia's pacing. Two years between cycles. Considering the slow sales maybe they will take their sweet time.
 
I just got done playing Spider-Man on my PS4 Pro for the first time, one of those "holy shit, I own this!?" moments. Had the best 2 weeks of my life gaming, still amazed the PS4 Pro pulls off those visuals... smooth gameplay, fluid combat, great graphics at faux-4K checkerboard upscaling, just overall impressed across the board. $250 console and a $9.99 game.

What games are you guys dropping $2000 on a GPU to play again? FireStrike?

The return you get on these $1200+ GPUs is just really sad at this point... I really think PC gaming has lost its way with these marginal gains at outrageous prices... Cyberpunk 2077 path tracing is pretty, but that's not a $1500 investment to turn on the "even prettier lights" option... to me anyhow, YMMV.

For me it's the fun of researching, choosing and building my own PC and tweaking on it. As much as I love gaming, the building is half the fun. I agree tho that if you're not into building/tweaking/tuning then you're better off with a console especially at these prices.
 
I care less about the "performance leap" than the price for that performance.
Bingo. The performance leap should be a given; hell, the only reason they should release new cards is that they're "much faster/better" than the previous generation at the same price level (looking at you, 4060 Ti). But if they go full derpderp and price the next generation 70%+ above the previous gen, then getting a 100% uptick in performance (assuming it delivers that) is absolutely irrelevant. I get that silicon is expensive, but if all you're doing is catering to the crowd with more money than sense, then you might as well cut all consumer cards and stick with the AI bullshit for large data centers.
 

NVIDIA CEO, Jensen Huang, Says Next-Gen GPU Will Be Made By TSMC

https://wccftech.com/nvidia-ceo-jensen-huang-says-next-gen-gpu-will-be-made-by-tsmc/
 
Wouldn't be surprised if Blackwell is a data center/streaming-service option. They have priced these things so high that fewer and fewer can afford the initial outlay. So Nvidia locks you into a revolving subscription where you own nothing, but the plebs can afford it on a month-to-month basis...
 
For me it's the fun of researching, choosing and building my own PC and tweaking on it. As much as I love gaming, the building is half the fun. I agree tho that if you're not into building/tweaking/tuning then you're better off with a console especially at these prices.
Yeah, building the box, the new goodies, that's always fun... it just seems like we're living in a world where the hardware manufacturers want to be on a 2-year uptick cycle but software development is more on 4-5. If they release the 5000 series with 'some improvement' over the 4000 series but cut the price dramatically (not $50), I'll bite. I just fear the UE5 games that are going to ship any decade now (cough) are going to cripple these modern cards. Dunno, we'll all find out together I guess...
 
I want performance. I don't care about price, but I'm rich and make a shitload of money. I will also be getting an extra 4090 and building a render box. I wonder what a used 4090 will go for?
 
Well, actually, I just wanna see jacketman in some new styles of jackets, something moar up to date/modern, with moar collars and longer sleeves, bigger buttons & zippers, and most important, uber-mega-supremely expensive, you know, to match his new GPU's, hehehe :)
 
Well, actually, I just wanna see jacketman in some new styles of jackets, something moar up to date/modern, with moar collars and longer sleeves, bigger buttons & zippers, and most important, uber-mega-supremely expensive, you know, to match his new GPU's, hehehe :)

He starts piloting a giant green mecha

Edit: And he doesn't like to rehearse as we all know, so it goes very, very wrong. But he keeps doing it anyway 👍
 
I just got done playing Spider-Man on my PS4 Pro for the first time, one of those "holy shit, I own this!?" moments. Had the best 2 weeks of my life gaming, still amazed the PS4 Pro pulls off those visuals... smooth gameplay, fluid combat, great graphics at faux-4K checkerboard upscaling, just overall impressed across the board. $250 console and a $9.99 game.

What games are you guys dropping $2000 on a GPU to play again? FireStrike?

The return you get on these $1200+ GPUs is just really sad at this point... I really think PC gaming has lost its way with these marginal gains at outrageous prices... Cyberpunk 2077 path tracing is pretty, but that's not a $1500 investment to turn on the "even prettier lights" option... to me anyhow, YMMV.
The $1500 I spent on my 3090 was a legitimate investment because I made it back crypto mining. Of course I also got to enjoy gaming on it on my 4k 120hz OLED with gsync, previous cards couldn't do that because they didn't have HDMI 2.1.

The $1500 I spent on my 4090 was an even better investment because it gave me bragging rights in this forum.

But yeah, seriously not worth buying the highest end for most people. I'm rich enough it doesn't matter to me.
 
I have a very large case, and the 4090 is already pushing it heavily. Unless Nvidia is willing to create an external box that connects to the PCIe slot through the back of my PC, I think we're at the limit of how large we can make a GPU. I sort of equate this to Smartphone size; smartphones kept getting larger and larger as people demanded more and more. Eventually, people maxed out on size and started saying "That's too big". I think we're at that limit with the current 4090.
 
It makes no sense to release anything as long as nVidia isn’t selling cards. Frankly they should just get out of the gaming market if they can’t release cards that offer the performance AND the price people want. Because they’re way down on market share and AMD is picking it up with RDNA2.
 
I don't think we see big new architectures next year; maybe 2025. Demand is not as great as it was last gen (the crypto boom), and given that cards are not selling out, I think you will see them slow the release this time. We might see a Ti series come and last us through 2024, and then the new architecture in 2025.
 
Personally I think they will since AI is taking off and they can sell the 5000 series chips for a good chunk. So far the ONLY good 4000 series card IMO is the 4090.
I disagree. The 4090 is $100 more for a 75 percent uplift, while the 4080 is the same price as the 3080 Ti for a 40 percent or more uplift and 4GB more VRAM. The 4070 Ti is $100 more than the 3080 MSRP for a 25 to 30 percent uplift and 2GB more VRAM. Those are solid gains for the pricing. Plus you gain DLSS3 frame gen and faster raytracing, serious features for high-end gaming nowadays. EDIT: Also, to top it off, they're actually available at MSRP instead of much higher, like the 3xxx series was.
 
I disagree. The 4090 is $100 more for a 75 percent uplift, while the 4080 is the same price as the 3080 Ti for a 40 percent or more uplift and 4GB more VRAM. The 4070 Ti is $100 more than the 3080 MSRP for a 25 to 30 percent uplift and 2GB more VRAM. Those are solid gains for the pricing. Plus you gain DLSS3 frame gen and faster raytracing, serious features for high-end gaming nowadays. EDIT: Also, to top it off, they're actually available at MSRP instead of much higher, like the 3xxx series was.
Mathematically it makes sense in my head... while I know many will disagree, it does to me and here's my logic:

The 2070S to the 3070 Ti saw a price increase of $100, from $499 to $599, and a 41% increase in performance, with not much else to talk about since no new features were added for the 30-series.

The 3070 Ti to 4070 Ti saw yet another price increase, this time $200 ($100 more than the 2070S-to-3070 Ti bump), but in exchange you got roughly a 50% increase in performance, as well as Shader Execution Reordering, Opacity Micromaps, and DLSS 3, which includes Frame Generation and Reflex. Then there's also the additional 4GB of VRAM and lower power consumption.

But take into account that manufacturing costs went up; the most important part of the GPU, the silicon, practically doubled in price. So, let's say the additional 4GB of VRAM and the hike in silicon costs account for half of the $200 price increase over the 3070 Ti; the other half would be the superior performance and the aforementioned features.

While I won't go around saying $800 is a great deal (it's not), at least Nvidia isn't pulling an AMD: giving you linear gains and better power consumption while slapping gobs of VRAM on a GPU that would shit the bed if that VRAM ever actually got used for something graphically intensive. You're actually getting nice new features that can, and most likely will, extend the 40-series' life span into the 60-series releases at least. Not to dog the 7900 XT or XTX, but they bring no new features, nothing groundbreaking, just linear gains over their predecessor and a copycat of Nvidia's Frame Gen with no concrete release date, with cheaper manufacturing (AMD's chiplets, IIRC, use a hybrid 5nm/6nm process) while commanding, at the time of release, a $900 and $1000 premium.

But to stay on topic: 2024, IMO, sounds about right for maybe a 5090. Considering the 3090 and 4090 released 2 years and 1 month apart, I'd speculate the 5090 might release October/November 2024 if Nvidia manages to secure enough silicon. I'd imagine the rest of the lineup probably releases within Q1 of 2025 if they don't run into silicon issues with 3nm. I'm honestly intrigued to see what the 50-series brings to the table. I'd imagine Nvidia probably heard the crying over VRAM and will rectify it with their next releases, at least one would hope. Although, if the rumors of them using HBM have any truth, it probably won't come cheap, but probably nowhere near the prices I see people quoting; I'd imagine they're going to be +/- $100 over their predecessor on everything but the 5090.
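As a sanity check on the gen-on-gen numbers being thrown around in this thread: the uplift percentages are the earlier poster's estimates and the MSRPs are the commonly cited US launch prices, so treat both as rough, but the arithmetic itself is straightforward:

```python
# Rough launch MSRPs and the uplift estimates quoted earlier in the thread.
claims = {
    # name: (last-gen price, new price, fractional uplift vs last gen)
    "4090 vs 3090":    (1499, 1599, 0.75),
    "4070 Ti vs 3080": (699,  799,  0.275),
}

for name, (old_price, new_price, uplift) in claims.items():
    old_ppp = old_price                  # dollars per 1.0x of last-gen performance
    new_ppp = new_price / (1 + uplift)   # dollars per same unit on the new card
    change = (new_ppp - old_ppp) / old_ppp
    print(f"{name}: price/perf {change:+.1%}")
```

If those uplift figures hold, that works out to roughly -39% and -10% price per performance respectively; whether that justifies the absolute sticker prices is, of course, the actual argument here.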
 
Are we still expecting a 4090 Ti to release, or will they go straight to the 5090? 🤔

100% we're getting a 4090 Ti, likely end of Q3/Q4 this year; no way they're passing up on milking the fanboys for more money. I highly doubt they're releasing the 50 series any earlier than Q3 2024. It'll likely be a workstation/AI-specific GPU series in the $2K-$5K prosumer market, similar to what Volta was between the 10 and 20 series. There's too much money in the AI field to launch a GeForce GPU that can be used for AI when you can launch an AI-specific GPU for twice the price. Once the hype dies down on AI midway through 2024, they'll try to build it back up with the 50 series.

That's my prediction based on Nvidia's launch history.
 
Launch history says we'll have 2 years of milking minimum; there's a whole host of workstation cards yet to be released, on top of yet another DGX box. Anyone have 12kW of power handy for just one box? Stating a next-gen release for next year is just rumor milking.
 
I disagree. The 4090 is $100 more for a 75 percent uplift, while the 4080 is the same price as the 3080 Ti for a 40 percent or more uplift and 4GB more VRAM. The 4070 Ti is $100 more than the 3080 MSRP for a 25 to 30 percent uplift and 2GB more VRAM. Those are solid gains for the pricing. Plus you gain DLSS3 frame gen and faster raytracing, serious features for high-end gaming nowadays. EDIT: Also, to top it off, they're actually available at MSRP instead of much higher, like the 3xxx series was.
You aren't wrong. The issue is Nvidia selling you lesser chips for more. If they wanted to name it a 4080 Ti with a similar die, sure. It's the fact they decided to call it a 4080 and sell it for way more. We can only compare names and prices gen to gen. It's not really our fault Nvidia decided to name it the 4080 at $1,200.
 
Launch history says we'll have 2 years of milking minimum, there's a whole host of workstation cards yet to be released on top of yet another dgx box. anyone have 12kW power handy for just 1 box? Stating a next gen release for next year is just rumor milking.

Maybe paper launch/announcement end of 2024 and then rollout over 2025
 
I disagree. The 4090 is $100 more for a 75 percent uplift, while the 4080 is the same price as the 3080 Ti for a 40 percent or more uplift and 4GB more VRAM. The 4070 Ti is $100 more than the 3080 MSRP for a 25 to 30 percent uplift and 2GB more VRAM. Those are solid gains for the pricing. Plus you gain DLSS3 frame gen and faster raytracing, serious features for high-end gaming nowadays. EDIT: Also, to top it off, they're actually available at MSRP instead of much higher, like the 3xxx series was.
You didn't mention the 4060 Ti. The 4090 isn't something most gamers would consider due to pricing. The 4080 as well, since it's $1,200, which is LOL. The 4070 Ti at $800 is laughable. DLSS3 is required on some of these graphics cards for ray tracing in new titles. The only reason to mention Nvidia's graphics card naming is to generally piss off consumers as a reminder of how much they're trying to abuse their market position. The only thing that matters is price, and if you're spending more to get more over a previous-generation product, then that isn't a good deal. Spending more to get more isn't how computers have worked for decades.

But take into account manufacturing costs went up, the most important part of the GPU, the Silicon, practically doubled in price. So, let's say the additional 4GB of VRAM and price hike in silicon costs account for half of the $200 price increase over the 3070Ti, the other half would be the superior performance, and the aforementioned features.
I like how you're justifying Nvidia's price increase when their competition doesn't have these problems. Firstly, Linus said that an extra 8GB of VRAM costs Nvidia only $15. The Intel A770 with 16GB of VRAM can be found for $340. AMD's 6800 with 16GB can be found for $500. So yea, stop defending Nvidia.
While I won't go around saying $800 is a great deal, it's not, but at least Nvidia isn't pulling an AMD and just giving you linear gains and better power consumption while slapping gobs of VRAM on a GPU that would shit the bed if put in a situation where that amount of VRAM actually got used for something graphically intensive,
I think we're looking at very different benchmarks.
you're actually getting nice new features that can, and most likely will, extend the 40-series life-span into the 60-series releases at least.
What new features? Please don't say DLSS because that shouldn't be needed when buying a new GPU. Older GPU's or sub $200 then sure, but not $300 or $400 and up graphic cards.
Not to dog the 7900XT or XTX, but no new features, nothing groundbreaking, just linear gains over their predecessor, and copycat of Nvidia's Frame Gen with no concrete release date, with cheaper manufacturing considering AMD's chiplets, IIRC, use a hybrid 5nm-6nm process while commanding, at the time of release, a $900 and $1000 premium.
What new features? You mean features like DLSS3 that will likely die off and fade away because Nvidia not only locked it to their graphics cards, but made it exclusive to the RTX 40 series and up? You will likely see more games feature only FSR because of restrictions like those. AMD's cheaper manufacturing is smarter, not actually cutting corners; if you know anything about silicon manufacturing, it's beneficial for performance as well. The price I do agree with, because AMD is flying too close to Nvidia's sun and will likely fail unless they offer much cheaper prices. Not that Nvidia is having much success with any of their 40-series GPUs other than the 4090, if their earnings report is anything to go by.
I'm honestly intrigued to see what the 50-series brings to the table.
I'm not considering the shit show that is the 40 series.
I'd imagine Nvidia probably heard the crying over VRAM and will probably rectify it with their next releases, at least one would hope they did. Although, if the rumors of them using HBM have any truth to them, it probably won't come cheap, but probably nowhere near the prices I see people quoting, I'd imagine they're either going to be +/- $100 over their predecessor on everything not the 5090.
If Nvidia is looking to release the RTX 50 series sooner then that's because their 40 series aren't selling. This is similar to what happened after the crypto market crashed back in 2017 when Nvidia released the RTX 20 series and nobody bought them because the performance uplift wasn't there compared to the GTX 10 series. If Nvidia is going to release the RTX 50 series this year, then you know the AI thing isn't panning out.
 
I disagree. The 4090 is $100 more for a 75 percent uplift, while the 4080 is the same price as the 3080 Ti for a 40 percent or more uplift and 4GB more VRAM. The 4070 Ti is $100 more than the 3080 MSRP for a 25 to 30 percent uplift and 2GB more VRAM. Those are solid gains for the pricing. Plus you gain DLSS3 frame gen and faster raytracing, serious features for high-end gaming nowadays. EDIT: Also, to top it off, they're actually available at MSRP instead of much higher, like the 3xxx series was.
Raytracing is a bonus for sure, but if the video card is lacking the VRAM, then it's a moot point. Games at 1080p/1440p are already running out of memory on 12GB cards. DLSS3 is a gimmick, and no one wants fake frames where the image quality sucks ass.

4080 - $200 overpriced with only 16GB of memory, which should be a mid-tier amount.
4070 Ti - $200 overpriced with only 12GB of memory, which should be a low-end amount.
4070 - $200 overpriced with only 12GB of memory, which should be a low-end amount.
4060 Ti - $100 overpriced with only 8GB of memory, which should be for entry-level cards.

Seems like all those cards are overpriced, and most of them lack VRAM (the 4080 I guess is OK; 16GB is lacking, but it can get by). There is a reason Nvidia can't sell these cards.
 
Does this mean the 5060 Ti will perform the same as the 4060 Ti and 3060 Ti?

I think you're forgetting the most important features:

- It will support some new technology that isn't universally implemented in all games and has drawbacks.

- The price will be $600. This will be a feature and a selling point. Most expensive 60 Ti the world has ever seen!

:ROFLMAO:
 
I just got done playing Spider-Man on my PS4 Pro for the first time, one of those "holy shit, I own this!?" moments. Had the best 2 weeks of my life gaming, still amazed the PS4 Pro pulls off those visuals... smooth gameplay, fluid combat, great graphics at faux-4K checkerboard upscaling, just overall impressed across the board. $250 console and a $9.99 game.

What games are you guys dropping $2000 on a GPU to play again? FireStrike?

The return you get on these $1200+ GPUs is just really sad at this point... I really think PC gaming has lost its way with these marginal gains at outrageous prices... Cyberpunk 2077 path tracing is pretty, but that's not a $1500 investment to turn on the "even prettier lights" option... to me anyhow, YMMV.
PC gamers have this mentality that you have to have top of the line...all settings set to ultra and all RTX features on.

When in reality, most ultra settings are placebo and hurt performance for zero gain, and RTX is broken in most games (Hogwarts and RE4, for example).
 
I think you're forgetting the most important features:

- It will support some new technology that isn't universally implemented in all games and has drawbacks.

- The price will be $600. This will be a feature and a selling point. Most expensive 60 Ti the world has ever seen!

:ROFLMAO:

but... I like playing my games at 720p with a $500 GPU just so my frame rate counter says it's running above 100fps, even though only half of that is actually rendered by the GPU.

/s ;)

Agree, Dion. If I'm doing things like screenshots or whatever, then yeah, I'll max everything out, because that's the only way you're going to notice the difference; otherwise I see no point in it. The vast majority of the time, running ultra settings in games just oversaturates or oversharpens everything, so it feels fake as hell and ruins it for me.
 