NVIDIA rumored to be preparing GeForce RTX 4080/4070 SUPER cards

It can't cost more than a 7900 XTX if it wants to compete. Otherwise... well, fools will still part with their money.

If people would just stop paying $2,000 for GPUs, we could see 4090s for $599 again, but society will do what makes the least sense.
I don't trust these prices. They're in USD, and when I do the conversions for my country (Canada/CAD), they come out to less than what the current series (4070 Ti, 4080) goes for here.
The cheapest 4070 Ti is $1,100 CAD (~$825 USD) and the 4080 is $1,600 CAD (~$1,200 USD). We get a 'Canuck tax', and I suspect those in Australia and Europe do too: an extra amount added on top of US market prices.
Look at the 4080 price!
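The conversion above is easy to sanity-check yourself. A minimal sketch (the 0.75 exchange rate is an assumption from around the time of posting, not today's rate; swap in the current one):

```python
# Quick sanity check on the "Canuck tax": convert local CAD street
# prices to USD at an assumed exchange rate.
USD_PER_CAD = 0.75  # assumed rate, not live data

cad_street_prices = {"RTX 4070 Ti": 1100, "RTX 4080": 1600}

usd_equivalents = {
    name: round(cad * USD_PER_CAD)
    for name, cad in cad_street_prices.items()
}
for name, usd in usd_equivalents.items():
    print(f"{name}: ~${usd} USD")
```

This reproduces the ~$825 and ~$1,200 USD figures quoted above.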
 
I don't trust these prices. They're in USD, and when I do the conversions for my country (Canada/CAD), they come out to less than what the current series (4070 Ti, 4080) goes for here.
The cheapest 4070 Ti is $1,100 CAD (~$825 USD) and the 4080 is $1,600 CAD (~$1,200 USD). We get a 'Canuck tax', and I suspect those in Australia and Europe do too: an extra amount added on top of US market prices.
Look at the 4080 price!
I believe these prices could happen, at least in the US.

A Biden administration ban on the 4090 would impact the top line. Super sales need to replace that, I reckon.
 
I don't trust these prices. They're in USD, and when I do the conversions for my country (Canada/CAD), they come out to less than what the current series (4070 Ti, 4080) goes for here.
The cheapest 4070 Ti is $1,100 CAD (~$825 USD) and the 4080 is $1,600 CAD (~$1,200 USD). We get a 'Canuck tax', and I suspect those in Australia and Europe do too: an extra amount added on top of US market prices.
Look at the 4080 price!
And you shouldn't trust those prices anyway. Nvidia is notorious for changing prices right before launch. Retailers have to be on their toes.
 
Leaked benchmarks have the 4070 Super being close in performance to the 4070 Ti (which is not surprising)...the major problem with the 4070 Super is the 12GB of VRAM
 
16GB on the 4080 Super seems a little low, tbh.
The next option is 32GB, if they clamshell the memory like they did for the 4060 Ti 16GB.

But I don’t think Nvidia would grace us with that sort of memory, it would be too good at doing the job of much more expensive cards.
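For anyone wondering why the jump goes straight from 16GB to 32GB: a quick sketch of the math, assuming today's common GDDR6X package sizes (32 bits wide, 2GB per package):

```python
# Why 32GB is the next stop after 16GB on a 256-bit bus: each GDDR6X
# package is 32 bits wide and currently tops out at 2GB, and clamshell
# mode hangs two packages off each 32-bit channel instead of one.
BUS_WIDTH_BITS = 256
BITS_PER_PACKAGE = 32
GB_PER_PACKAGE = 2

channels = BUS_WIDTH_BITS // BITS_PER_PACKAGE   # 8 channels on a 256-bit bus
normal_gb = channels * GB_PER_PACKAGE           # one package per channel
clamshell_gb = 2 * channels * GB_PER_PACKAGE    # two packages per channel
print(normal_gb, clamshell_gb)
```

With 2GB packages there is simply no configuration between 16GB and 32GB on a 256-bit bus without cutting the bus down.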
 
The next option is 32GB, if they clamshell the memory like they did for the 4060 Ti 16GB.

But I don’t think Nvidia would grace us with that sort of memory, it would be too good at doing the job of much more expensive cards.
They can't do 24GB like the 4090?
 
The cut-down 4090D option for China, created to bypass export restrictions, makes any 4080 Ti (20GB, 320-bit bus) very unlikely. It appears Nvidia can just make more money selling a special lower-tier AD102-die version to China. Obviously they were building/binning these AD102 dies for something.
 
If I had to wager, there was originally going to be a 4090 Super, and the 4080 Super would have used those cut-down China-edition AD102s. Instead, all that stock goes to China, so the 4080 Super has to use fully enabled AD103 dies, and they just keep selling 4090s as-is.
 
New video from that Moore's Law is Dead YouTuber...I have no idea if he's reliable, but his latest video has a lot of details...seems like the 4080 Super will be around 40% of the 4090's performance (and cost 20% less than the vanilla 4080's MSRP) and the 4070 Ti Super will be closer to the original 4080:

4080 Super 16GB 256-bit
6-9% performance improvement over 4080
$999

4070 Ti Super 16GB 256-bit
14-22% performance improvement over 4070 Ti
$799-$849

4070 Super 12GB 192-bit
$599-$649


https://www.youtube.com/watch?v=W5K8GM2fNDM


Looks like this guy was right on the money...leaked prices are exactly what he said on the lower end:

4080 Super: $999
4070 Ti Super: $799
4070 Super: $599

https://twitter.com/Zed__Wang/status/1743709825488609311
 
They just need to bring back the Titan models and it will be history repeating itself.
*90/*90Ti is the new Titan. I know NV insisted that the 3090 was not a Titan, but it clearly was. The Titan cards were always in a weird position in that they kinda made the corresponding Quadro cards look bad and I doubt NV wants any more of that.

It can't cost more than a 7900 XTX if it wants to compete. Otherwise... well, fools will still part with their money.

If people would just stop paying $2,000 for GPUs, we could see 4090s for $599 again, but society will do what makes the least sense.
I agree that GPU prices have gotten ridiculous, but a $599 flagship card is never happening. Everything is more expensive now: tech, housing, food, vehicles, people's labor, the list goes on. The 1080 Ti was $699 seven years ago, the 980 Ti was $649 nine years ago, the 780 Ti was $699 eleven years ago... heck, the 8800 Ultra was $849 seventeen years ago. And wafer costs are through the roof, so high-end GPUs cost more to produce than they ever have. Even if consumer attitudes drastically shift, I would be stunned to see a flagship Nvidia GPU going for less than $1000. If the absolute most people were willing to pay for a GPU in the 2020s was $599, that card would be using at most a <400mm^2 chip with a 256-bit bus.
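Those launch prices look even closer to today's once you adjust for inflation. A rough sketch; the cumulative CPI factors below are approximate figures I plugged in, not official statistics:

```python
# Approximate inflation adjustment of past flagship MSRPs into 2024
# dollars. The cumulative CPI factors are rough assumptions.
flagships = [
    # (card, launch MSRP USD, assumed cumulative inflation factor to 2024)
    ("8800 Ultra (2007)", 849, 1.50),
    ("GTX 780 Ti (2013)", 699, 1.35),
    ("GTX 980 Ti (2015)", 649, 1.32),
    ("GTX 1080 Ti (2017)", 699, 1.28),
]
adjusted = {card: round(msrp * factor) for card, msrp, factor in flagships}
for card, today in adjusted.items():
    print(f"{card}: ~${today} in 2024 dollars")
```

Even under these rough factors, the 8800 Ultra lands well north of $1,200 in today's money, which supports the point that sub-$1,000 flagships were never as cheap as they felt.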
 
*90/*90Ti is the new Titan. I know NV insisted that the 3090 was not a Titan, but it clearly was. The Titan cards were always in a weird position in that they kinda made the corresponding Quadro cards look bad and I doubt NV wants any more of that.
The Titan name is dead to force education and non-profit business over to the professional-series GPUs. If I put something with "Gamerz RGB XXX BJ Edition" in the title, I'd have half a dozen auditors up my ass before year end.
IT departments working with local, non-technically-literate oversight misappropriating funds... that never happens... So you'd better bet they watch hard for it, and they jump the second they sniff anything adjacent to it.
So by killing off the Titan branding, which was easily seen as prosumer/work branding, and replacing it with a bunch of gamer BS, it made government, education, and any business working with public funds or grants unable to purchase them, because dealing with the heat they bring down isn't worth the money saved.
 
The Titan name is dead to force education and non-profit business over to the professional-series GPUs. If I put something with "Gamerz RGB XXX BJ Edition" in the title, I'd have half a dozen auditors up my ass before year end.
IT departments working with local, non-technically-literate oversight misappropriating funds... that never happens... So you'd better bet they watch hard for it, and they jump the second they sniff anything adjacent to it.
So by killing off the Titan branding, which was easily seen as prosumer/work branding, and replacing it with a bunch of gamer BS, it made government, education, and any business working with public funds or grants unable to purchase them, because dealing with the heat they bring down isn't worth the money saved.
That makes a lot of sense. I've heard stories from folks with gov/edu backgrounds about how difficult it was to build their own workstations for research and production, because anything with a slightly gamer-y-sounding name would get flagged.

NV knows they can't stop 1099s and farms from using RTX gaming boards as compute accelerators (not without hurting their value) but the Titan branding was too dangerous for their segmentation of "beyond this point, you must use Pro/Enterprise cards"
 
If the absolute most people were willing to pay for a GPU in the 2020s was $599, that card would be using at most a <400mm^2 chip with a 256-bit bus.
This is rumoured to be the top RDNA 4 card, to be released in H2 of this year.

In 2 years' time, AMD will be back to flogging big cards (which they hope to sell for $2,000+).
 
This is rumoured to be the top RDNA 4 card, to be released in H2 of this year.

In 2 years' time, AMD will be back to flogging big cards (which they hope to sell for $2,000+).
Yeah, I heard that too. AMD is probably waiting until their dual-GCD setup is realistic to productize (since it apparently didn't make the cut for RDNA3, despite early leaks/rumors).
 
Back on-topic...

RTX 4070S / 4070TiS / 4080S are officially announced (Videocardz)

TL;DR - the leaks were all correct regarding specs and price. The 4070S and 4070TiS slot in at the same price as the 4070 / 4070 Ti; the 4080S gets a $200 price drop vs. the 4080.

More VRAM on 4070TiS and price of 4080S make them interesting IMO. 4070S is kinda... meh?

I dunno - the 4070S slots into that no man's land between the RTX 4070 and the RTX 4070 Ti - all for the "old" price. Definitely has a nice slot, for sure - especially versus AMD offerings. $600 to $800 is a big jump in that price segment.

I can't help but feel this is how they should have launched everything in the first place. $1000 to $1600 feels "right" for the 4080S to 4090. I think we all had pain RE: $1200 to $1600 for the difference.
 
I can't help but feel this is how they should have launched everything in the first place.

It is Turing all over again.

I agree that GPU prices have gotten ridiculous, but a $599 flagship card is never happening. Everything is more expensive now: tech, housing, food, vehicles, people's labor, the list goes on. The 1080 Ti was $699 seven years ago, the 980 Ti was $649 nine years ago, the 780 Ti was $699 eleven years ago... heck, the 8800 Ultra was $849 seventeen years ago. And wafer costs are through the roof, so high-end GPUs cost more to produce than they ever have. Even if consumer attitudes drastically shift, I would be stunned to see a flagship Nvidia GPU going for less than $1000. If the absolute most people were willing to pay for a GPU in the 2020s was $599, that card would be using at most a <400mm^2 chip with a 256-bit bus.

I bought an i7-12700K this summer for $260. 4TB SSDs were down to $150 or so this past holiday season. For most of the past year, parts have been at their lowest in 3 years... except GPUs. Well, besides the 2+ year old excess stock they want to sell off, and even then it's been a slow drop.
 
It can't cost more than a 7900xtx if it wants to compete otherwise ... well fools will still part with thier money

It actually can. Nvidia has over 80% market share because enough people will never consider an AMD GPU; they won't even look at one. Nvidia sets the price where they want it as a result, which everyone then complains about while asking "why isn't there more competition?". The evidence is in how many people bought a 4070 Ti and regurgitated Nvidia marketing lines about how 12GB is "totally enough" because "1440p or whatever", despite performance tanking in scenarios where a little more VRAM would have kept games playable.
 
It actually can. Nvidia has over 80% market share because enough people will never consider an AMD GPU; they won't even look at one. Nvidia sets the price where they want it as a result, which everyone then complains about while asking "why isn't there more competition?". The evidence is in how many people bought a 4070 Ti and regurgitated Nvidia marketing lines about how 12GB is "totally enough" because "1440p or whatever", despite performance tanking in scenarios where a little more VRAM would have kept games playable.
That's why I included a safety disclaimer, haha:

"Fools will still part with their money."
 
It actually can. Nvidia has over 80% market share because enough people will never consider an AMD GPU; they won't even look at one. Nvidia sets the price where they want it as a result, which everyone then complains about while asking "why isn't there more competition?". The evidence is in how many people bought a 4070 Ti and regurgitated Nvidia marketing lines about how 12GB is "totally enough" because "1440p or whatever", despite performance tanking in scenarios where a little more VRAM would have kept games playable.
The memory thing is overblown. Is it good to have moAr memory? Sure. You could say the same for the "other" side, with AMD folks talking about how their extra memory is the best. Sure, it's better to have more memory, but for me personally, I'd rather have a more mature driver set, toolset, CUDA, G-SYNC, etc. (not even going to mention RT, because it's a non-factor for me, for the most part).
 
Yeah, I heard that too. AMD is probably waiting until their dual-GCD setup is realistic to productize (since it apparently didn't make the cut for RDNA3, despite early leaks/rumors).
It's more due to limitations on TSMC's advanced-packaging capabilities. Demand for TSMC's advanced packaging and assembly nodes increased far faster than their ability to procure the hardware to keep up with it. So not only did they jack up the prices, they shortened the windows too, so the consumer viability of those nodes suffers if they want to keep their higher margins. And with consumer spending expected to fall off a cliff, I'd bet AMD and Nvidia are both looking to tighten things up in advance.
Hence the consumer delay of Hopper (which heavily relies on TSMC advanced packaging) and AMD's scaling back of the higher-end consumer GPU segment.
 
*90/*90Ti is the new Titan. I know NV insisted that the 3090 was not a Titan, but it clearly was. The Titan cards were always in a weird position in that they kinda made the corresponding Quadro cards look bad and I doubt NV wants any more of that.
Yes and no. The 90/90 Ti is clearly a rebrand of the 80 Ti tier with a price hike, but it's also sort of a hybrid between that and what used to be the Titan, with its 24GB. That said, historically, Titans had functionality that was gimped on GeForce cards, and their own drivers to enable those workloads. That might not be the case anymore, I'm not sure, but at release there were definitely things the RTX Titan still outperformed the 3090 on because of that. To just blanket-say 90s are Titans... I don't know.

What I do know is that historically, the 80 Ti cards were the top-end GeForce cards of a given generation. That simply hasn't been true since the 30-series, and the 40-series has yet to even get one over a year later.
 
Yes and no. The 90/90 Ti is clearly a rebrand of the 80 Ti tier with a price hike, but it's also sort of a hybrid between that and what used to be the Titan, with its 24GB. That said, historically, Titans had functionality that was gimped on GeForce cards, and their own drivers to enable those workloads. That might not be the case anymore, I'm not sure, but at release there were definitely things the RTX Titan still outperformed the 3090 on because of that. To just blanket-say 90s are Titans... I don't know.

What I do know is that historically, the 80 Ti cards were the top-end GeForce cards of a given generation. That simply hasn't been true since the 30-series, and the 40-series has yet to even get one over a year later.
If we see an 80 Ti, it would only be because NVIDIA has decided to postpone the 5000-series until Q1 2025 (versus the usual Q4 2024). Now would be the time to make that type of move.
 
The Titan name is dead to force education and non-profit business over to the professional-series GPUs. If I put something with "Gamerz RGB XXX BJ Edition" in the title, I'd have half a dozen auditors up my ass before year end.
IT departments working with local, non-technically-literate oversight misappropriating funds... that never happens... So you'd better bet they watch hard for it, and they jump the second they sniff anything adjacent to it.
So by killing off the Titan branding, which was easily seen as prosumer/work branding, and replacing it with a bunch of gamer BS, it made government, education, and any business working with public funds or grants unable to purchase them, because dealing with the heat they bring down isn't worth the money saved.

I presume that's why Asus launched the "ProArt" series. Same card, no gamer naming marketing nonsense.
 
If we see an 80 Ti, it would only be because NVIDIA has decided to postpone the 5000-series until Q1 2025 (versus the usual Q4 2024). Now would be the time to make that type of move.
Agreed...yet all they announced today at the 80 tier was a 4080 Super, which is just the full AD103, so ~5% better than a 4080. Granted, it comes with a price cut, which is good since they were trying to sell it for 80 Ti prices.

Still, the $999 doesn't feel good considering the 3080 was $699. But given AMD was content to just slot into the initial Nvidia pricing structure with the 7900 XTX at $999, I guess Nvidia has no incentive to go any lower.
 
the 4080 Super is the price that the original 4080 should have launched at...$1200 was a crazy price...it was a great-performing card, but not at $1200...if you're going to spend $1200 on a GPU, why not just go for the much better 4090...$1000 should be the max for a high-end GPU...the 4090 was more of a Titan-level card, so the $1400-$1600 is fine

the 4070 Ti Super is the one I'm looking at as an upgrade to my 3080...the 16GB of VRAM is not ideal, but since I'm still on 1440p I think it'll be fine...crazy that $799 seems like a good deal for a non-top card in the lineup
 
The memory thing is overblown. Is it good to have moAr memory? Sure. You could say the same for the "other" side, with AMD folks talking about how their extra memory is the best. Sure, it's better to have more memory, but for me personally, I'd rather have a more mature driver set, toolset, CUDA, G-SYNC, etc. (not even going to mention RT, because it's a non-factor for me, for the most part).

This comment, likely unintentionally, proves my point exactly. The memory thing is demonstrably NOT overblown. You can check the benchmarks yourself. The 4070 Ti drops off more than it should at 4K in many titles vs. the 7900 XT, which is its nearest price competitor and remains playable at that resolution. If you're gaming at 4K, unless you're using a 4090, you're likely not taking advantage of a lot of what Nvidia has to offer, because things like RT will tank the frame rate; at least, that's the judgment I made when I decided to buy AMD this round. I didn't need 24GB of VRAM, but I needed more than 12GB, so the 4070 Ti, which was the card I would initially have been shopping for, was immediately ruled out. I can tell you my personal experience with the drivers has been more or less the same as far as gaming is concerned, and going from G-SYNC to FreeSync has provided an identical experience. CUDA is essentially irrelevant to me because I'm exclusively gaming with this card.

There is an easy case to be made for buying Nvidia over AMD: they have better technology and a better feature set. That case can, and does, fall apart at certain resolutions due to the lack of VRAM, particularly because features like RT are more VRAM-intensive. At 1440p, the 4070 Ti is generally OK as of now, but it won't have the legs it deserves going forward. VRAM will matter more in a few years than RT, and that can make the difference depending on how long you want to keep your card.
 
Guess I'll hold off a little longer on opening my Velcro Ninja Turtles wallet for the 4070 Super reviews, but it seems that may be my horse. Well, unless an errant 4070 Ti drops into the low-$500 range.
 
The memory thing is overblown. Is it good to have moAr memory? Sure. You could say the same for the "other" side, with AMD folks talking about how their extra memory is the best. Sure, it's better to have more memory, but for me personally, I'd rather have a more mature driver set, toolset, CUDA, G-SYNC, etc. (not even going to mention RT, because it's a non-factor for me, for the most part).
It's not just the memory capacity, it's the bandwidth. The 4070 Ti with its measly 192-bit bus definitely shows its deficiencies at 4K in terms of performance scaling. And as for capacity: as a 3080 Ti owner, I can tell you 12GB is becoming not enough for 4K RT in some titles. The increased L2 helps, sure, but it isn't the magic band-aid Nvidia (or AMD for that matter, with their Infinity Cache) wants you to think.

Now, an argument could be made that a 4070 Ti isn't for 4K, but then I'd remind you that this is a card that costs north of $800, so that really should not be the case.

EDIT: Also, I think the mere existence of the 4070 Ti Super, now sporting the same AD103 die / 256-bit bus / 16GB as the 4080 at that same $800 price point, shows that those who said "oh, manufacturing costs went up, etc." to justify the insane cost increase when the initial 40-series cards released were wrong. A 4080 could, in fact, have been an $800 card, only $100 more than the 3080's MSRP. Then the 70-class tier wouldn't have been so screwed up.
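The bus-width point above is easy to quantify: peak bandwidth is just bus width times per-pin data rate. A quick sketch using the published memory speeds of these cards:

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(192, 21.0))   # 4070 Ti, GDDR6X @ 21 Gbps: ~504 GB/s
print(bandwidth_gb_s(256, 22.4))   # 4080, GDDR6X @ 22.4 Gbps: ~717 GB/s
print(bandwidth_gb_s(384, 19.0))   # 3080 Ti, GDDR6X @ 19 Gbps: ~912 GB/s
```

Even before the larger L2 is considered, the 192-bit card has barely over half the raw bandwidth of a 3080 Ti, which is exactly the 4K scaling gap being described.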
 
It's not just the memory capacity, it's the bandwidth. The 4070 Ti with its measly 192-bit bus definitely shows its deficiencies at 4K in terms of performance scaling. And as for capacity: as a 3080 Ti owner, I can tell you 12GB is becoming not enough for 4K RT in some titles. The increased L2 helps, sure, but it isn't the magic band-aid Nvidia (or AMD for that matter, with their Infinity Cache) wants you to think.

Now, an argument could be made that a 4070 Ti isn't for 4K, but then I'd remind you that this is a card that costs north of $800, so that really should not be the case.
That, I can buy. But there are no games that I play that require 12GB. I know because, even though I have an RTX 4090 at home, I have an RTX 4060 in my laptop. It's overblown. Future-proofing is a fool's errand.
 
the 4080 Super is the price that the original 4080 should have launched at...$1200 was a crazy price...it was a great-performing card, but not at $1200...if you're going to spend $1200 on a GPU, why not just go for the much better 4090...$1000 should be the max for a high-end GPU...the 4090 was more of a Titan-level card, so the $1400-$1600 is fine

the 4070 Ti Super is the one I'm looking at as an upgrade to my 3080...the 16GB of VRAM is not ideal, but since I'm still on 1440p I think it'll be fine...crazy that $799 seems like a good deal for a non-top card in the lineup

People lined up for the $1600 4090, which means, as much as I want one on the cheap, it wasn't mispriced. On the other hand, no one showed up for the 4080. $1000 for the Super is a logical move. I'll be curious to see what AMD ends up doing with the 7900 XTX, because at $1000 for the 4080 Super, there is zero reason to buy one at its current price.
 
That, I can buy. But there are no games that I play that require 12GB. I know because, even though I have an RTX 4090 at home, I have an RTX 4060 in my laptop. It's overblown. Future-proofing is a fool's errand.
Overblown, I agree. And in tech there really is no such thing as future-proofing. There is, though, the expectation that a card won't age too poorly, and I was extremely disappointed to hit VRAM issues in one title less than 12 months after I bought the card. The last time I had VRAM issues was literally in 2012, with a 2GB GTX 680 and modded Skyrim.

Future-proofing isn't a thing, but gimping a card's longevity is, and I definitely regret buying the 3080 Ti instead of just springing for a 3090, since I'm at 4K now.
 
I'll be curious to see what AMD ends up doing with the 7900 XTX, because at $1000 for the 4080 Super, there is zero reason to buy one at its current price.
GeForce RTX 40 "SUPER" refresh performance projections, now incl. ray-tracing performance (3DCenter):
Nvidia's new SUPER cards should achieve, on average, a ~30% higher ray-tracing performance/price ratio than AMD's RX 7000 cards.

https://3dcenter.org/news/news-des-67-januar-2024
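To make concrete what a "+30% performance/price ratio" claim means, here's a tiny sketch with a made-up AMD baseline (the perf and price numbers are hypothetical placeholders, not 3DCenter's data):

```python
# Illustrating a "+30% RT performance/price ratio" claim: at equal
# price, it implies ~30% more RT performance. Baseline is made up.
amd_rt_perf, amd_price = 100.0, 999.0   # hypothetical baseline numbers
amd_ratio = amd_rt_perf / amd_price
nv_ratio = amd_ratio * 1.30             # the claimed +30% ratio advantage
nv_perf_at_same_price = nv_ratio * amd_price
print(nv_perf_at_same_price)
```

Equivalently, the same ratio advantage could show up as matching performance at ~23% lower price (1/1.3 ≈ 0.77).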
 
People lined up for the $1600 4090, which means as much as I want one on the cheap, it wasn’t mispriced. On the other hand, no one showed up for the 4080. $1000 for the Super is a logical move. I’ll be curious to see what AMD ends up doing with the 7900 XTX, because at $1000 for the 4080 Super, there is zero reason to buy one at the current price.

No one bought the 4080 because the 4090 was a significant upgrade...the people who can afford to pay $1200 for a GPU can also afford $1600...if the 4080 had been priced at $800 it would have sold much better (probably better than the 4090)...the problem is that Nvidia wanted everyone to get the 4090 and purposely made every other 40-series card irrelevant
 
No one bought the 4080 because the 4090 was a significant upgrade...the people who can afford to pay $1200 for a GPU can also afford $1600...if the 4080 had been priced at $800 it would have sold much better (probably better than the 4090)...the problem is that Nvidia wanted everyone to get the 4090 and purposely made every other 40-series card irrelevant
The sad part is, I look back at when I did my build, when the card shortages were happening and I had nothing, but snagged an AMD RX 6800 for $900 + tax...and I still go "damn!" at the thought of paying $1k for a video card.
 
This comment, likely unintentionally, proves my point exactly. The memory thing is demonstrably NOT overblown. You can check the benchmarks yourself. The 4070 Ti drops off more than it should at 4K in many titles vs. the 7900 XT, which is its nearest price competitor and remains playable at that resolution. If you're gaming at 4K, unless you're using a 4090, you're likely not taking advantage of a lot of what Nvidia has to offer, because things like RT will tank the frame rate; at least, that's the judgment I made when I decided to buy AMD this round. I didn't need 24GB of VRAM, but I needed more than 12GB, so the 4070 Ti, which was the card I would initially have been shopping for, was immediately ruled out. I can tell you my personal experience with the drivers has been more or less the same as far as gaming is concerned, and going from G-SYNC to FreeSync has provided an identical experience. CUDA is essentially irrelevant to me because I'm exclusively gaming with this card.

There is an easy case to be made for buying Nvidia over AMD: they have better technology and a better feature set. That case can, and does, fall apart at certain resolutions due to the lack of VRAM, particularly because features like RT are more VRAM-intensive. At 1440p, the 4070 Ti is generally OK as of now, but it won't have the legs it deserves going forward. VRAM will matter more in a few years than RT, and that can make the difference depending on how long you want to keep your card.
The other side of that could very well be that the limited GPU capabilities of the existing 7000 series see performance fall off with future titles, even if they're really good now thanks to the extra memory. Trying to future-proof anything GPU-related right now is a hard ask, because things are changing too fast.
 