NVIDIA rumored to be preparing GeForce RTX 4080/4070 SUPER cards

will the 4070Ti Super and 4080 Super even be available at launch and will retailers try and price gouge?...will there be limited availability?...what if I decide to wait a month?
 
will the 4070Ti Super and 4080 Super even be available at launch and will retailers try and price gouge?...will there be limited availability?...what if I decide to wait a month?

TBD. It's incredibly hard to predict how things are going to go right now. The refresh might ignite excitement in the market or it might be a giant "meh." I doubt we'll see much gouging unless supply is super constrained.
 
TBD. It's incredibly hard to predict how things are going to go right now. The refresh might ignite excitement in the market or it might be a giant "meh." I doubt we'll see much gouging unless supply is super constrained.

I'm guessing it'll be a meh, as most people interested in a 4000 series card already got one...Nvidia somewhat fixed their pricing but I don't see tons of people rushing to get the 4080 Super (those people probably already own a 4080 or 4090)...the 4070 Ti Super will be the one to keep an eye on with its upgraded VRAM, 256-bit bus etc. (it's pretty much a 4080)
 
The 7900XTX is viable 4K gaming that isn't over a grand. There have been sales of the card for around $700-800 already. I have one, it's pretty damn good. Everything is fast and fluid, and that's without running FSR.

yeah, I almost bit on the 7900XTX for $799 last week but I want good ray tracing performance as well. Frankly though I was surprised it was up for a couple hours at that price.
 
yeah, I almost bit on the 7900XTX for $799 last week but I want good ray tracing performance as well. Frankly though I was surprised it was up for a couple hours at that price.

if you care more about rasterization then go with AMD...if you care about RT then it's Nvidia
 
Sure, if you ignore that:
  1. TSMC 4nm was triple the cost of Samsung 8nm at the launch of the RTX 4080
  2. The value of the US dollar fell 14.5% between the release of the RTX 3080 and RTX 4080
Yeah, of course TSMC 4nm is more expensive. Nvidia got major discounts on Samsung 8nm, which was functionally a less advanced node for the time they used it. Don't recall them lowering prices for the customer after they raised them with Turing, which was on a half-node of 16nm.

Source on what Nvidia is paying for wafers? Source on bad yields or anything contributing to said prices?

I am not saying the 4080 should have cost the same as the 3080, and I am willing to say OK, maybe a bit more than $800 with those factors. But $1200 was a greed number.

Personally I don't think anyone here talking about it knows the numbers for sure, but we definitely know Nvidia's penchant for taking a mile if you give them an inch, so I have a hard time giving them the benefit of the doubt and saying "yep, it's not greed, it's the other things", especially when Jensen comes out and says "yep, it's expensive because Moore's Law is dead" and then turns around and goes to a conference and says the exact opposite. You'll forgive me if I don't immediately buy into Nvidia trying to justify higher prices than what they probably actually needed, and you'll forgive me if I especially call it into question on the smaller AD103 dies. I can concede AD102 being as large as it is costing what it does, yet it's odd it was only $100 more expensive than the 3090 MSRP then, isn't it? So why is a much smaller, easier-to-yield, higher-quantity-per-wafer AD103 die suddenly a $500 premium over last gen?

Yeah, doesn't smell right. If they can make a profit selling an AD103-based card at $799, then something absolutely doesn't smell right here.
 
Yeah, of course TSMC 4nm is more expensive. Nvidia got major discounts on Samsung 8nm, which was functionally a less advanced node for the time they used it. Don't recall them lowering prices for the customer after they raised them with Turing, which was on a half-node of 16nm.

Source on what Nvidia is paying for wafers? Source on bad yields or anything contributing to said prices?

I am not saying the 4080 should have cost the same as the 3080, and I am willing to say OK, maybe a bit more than $800 with those factors. But $1200 was a greed number.

Personally I don't think anyone here talking about it knows the numbers for sure, but we definitely know Nvidia's penchant for taking a mile if you give them an inch, so I have a hard time giving them the benefit of the doubt and saying "yep, it's not greed, it's the other things", especially when Jensen comes out and says "yep, it's expensive because Moore's Law is dead" and then turns around and goes to a conference and says the exact opposite. You'll forgive me if I don't immediately buy into Nvidia trying to justify higher prices than what they probably actually needed, and you'll forgive me if I especially call it into question on the smaller AD103 dies. I can concede AD102 being as large as it is costing what it does, yet it's odd it was only $100 more expensive than the 3090 MSRP then, isn't it? So why is a much smaller, easier-to-yield, higher-quantity-per-wafer AD103 die suddenly a $500 premium over last gen?

Yeah, doesn't smell right. If they can make a profit selling an AD103-based card at $799, then something absolutely doesn't smell right here.
Simplest explanation is that both Nvidia & AMD wanted to clear old stock, plus there was a first-use premium at TSMC.

Both of those reasons are gone now. Navi 21 and 22 are on their way out, and Samsung 4nm will put pressure on TSMC. Plus improved memory, or alternatively cheaper contract prices on existing memory.

So this year should see a return to normal prices.
 
NVIDIA's RTX 5000 GPUs on track for Q4 2024 launch but don't expect a Lovelace performance leap

We shouldn't expect the same level of performance jump that Lovelace ushered in over and above Ampere...however, we might be looking at something slightly better than the uplift Ampere provided over Turing, with Moore's Law is Dead floating a very rough figure of around a 50% performance boost (a good deal more modest than the 80% to maybe even 100% generational performance increase that an RTX 4090 Ti would likely have boasted, had it ever been made - but that GPU just wasn't needed)...

https://www.tweaktown.com/news/9533...t-expect-lovelace-performance-leap/index.html
 
Yeah, of course TSMC 4nm is more expensive. Nvidia got major discounts on Samsung 8nm, which was functionally a less advanced node for the time they used it. Don't recall them lowering prices for the customer after they raised them with Turing, which was on a half-node of 16nm.
It was much cheaper, but Nvidia needed 2-3x more of it because of the terrible yields, 40-60%, vs TSMC 80% and up.

Nvidia pays per wafer regardless of how much they actually get out of one.
 
if you care more about rasterization then go with AMD...if you care about RT then it's Nvidia

yup, so I'll just wait until next gen I think. I really only play Battlefield games and the occasional BG3. I'm really not suffering with my 3070 and there really aren't any games that tickle my fancy on the horizon. I'll probably take the funds and buy an OLED monitor later in the year, possibly Prime Day, then when the 5 series launches I'll go there.
 
yup, so I'll just wait until next gen I think. I really only play Battlefield games and the occasional BG3. I'm really not suffering with my 3070 and there really aren't any games that tickle my fancy on the horizon. I'll probably take the funds and buy an OLED monitor later in the year, possibly Prime Day, then when the 5 series launches I'll go there.
Sounds like a good plan. Also keep an eye out for people (stupidly) dumping their 4070, 4070 Ti, or 4080 for a new SUPER card...plus there have been great sales on the 7900 XTX, which is a very nice card for the price.
 
Kind of annoying that they are releasing them from the bottom up. I'm in the market for a card to replace a 1080 Ti, but I have by far the least interest in the 4070 Super. Would much prefer to see how 4080 Super inventory shakes out before having to decide if I want to drop down to a 4070 Ti Super.
 
It was much cheaper, but Nvidia needed 2-3x more of it because of the terrible yields, 40-60%, vs TSMC 80% and up.

Nvidia pays per wafer regardless of how much they actually get out of one.
Didn't know yields were that poor at Samsung. That would explain those prices then.

Well....yeah, I am aware.
 
Agreed...yet all they announced today at the 80 tier was a 4080 Super, which is just the full AD103, so ~5% better than a 4080. Granted, with a price cut, which is good since they were trying to sell it at 80 Ti prices.

Still, the $999 doesn't feel good considering the 3080 was $699. But given AMD was content to just slot into the initial Nvidia pricing structure with the 7900 XTX at $999, I guess Nvidia has no incentive to go any lower.
I've read rumours before that they're colluding.... I think it's pretty obvious, they are...
 
I've read rumours before that they're colluding.... I think it's pretty obvious, they are...
Nah, they both use TSMC, so their price per square mm is the same; they require the same sized PCBs, the same cooling components, the same VRMs, capacitors, etc. They are made by the same AIBs who demand the same margins, use the same shipping companies, the same printers, and the same distributors who all need the same margins and cover the same costs. They even have the same primary investors who make nearly identical demands of their demographically identical boards.
AMD's and Nvidia's manufacturing costs and margin requirements are nearly identical; the only thing that separates them is their R&D and support costs, which Nvidia spends far more on.

If anything, it's the capacitor, VRM, RAM, and wafer manufacturers, and the shipping companies, that are colluding.

But as always, whoever launches their card first dictates the price, and it's up to whoever comes second and third to price accordingly. If Nvidia launches at $999, then even if AMD's identically performing equivalent was planned to launch at $599, they would launch at $999. Because, one, their investors would sue them if they launched it at $599 when their biggest competitor values it at $999. And two, if AMD did launch it at $599, bots would snatch them all up and scalpers would sell them at $999, and then the investors would sue AMD for undervaluing their product counter to market conditions.

Both AMD and Nvidia got in major trouble with their investors and their boards for undervaluing their cards during the pandemic. As far as they were concerned, the fact that bots and scalpers could flip those cards at 2x MSRP meant those cards should have had an MSRP 2x higher.

Now if AMD did counter Nvidia's $999 card with an identically performing card at $599, the only way they could dodge a lawsuit would be if they actually managed to supply the demand at that price and not have stock or scalper issues. Then they could reasonably claim they were aiming for market disruption and trying to increase their market penetration. But with TSMC's capacity limitations, that isn't happening.
 
Still, the $999 doesn't feel good considering the 3080 was $699

3080 was $699?...on paper it may have been but in reality did anyone actually get it for that price outside of the initial rush of Founders cards?...feels like a lifetime ago that GPU prices were actually reasonable...but...this isn't the 4080...it's the 4080 S-U-P-E-R
 
This is actually great marketing from Nvidia. They sold the FE series 4080 etc. at inflated new-gen pricing. Then they drop prices on the Super series to get you in shortly before the next gen lol.
 
3080 was $699?...on paper it may have been but in reality did anyone actually get it for that price outside of the initial rush of Founders cards?...feels like a lifetime ago that GPU prices were actually reasonable...but...this isn't the 4080...it's the 4080 S-U-P-E-R
AIB cards are almost never MSRP/SEP since that is the base price. You could still find cards close to that price if you purchased from a retailer, not a scalper. Best Buy and Gamestop don't pull the "market adjustment" BS that Newegg does.
 
AIB cards are almost never MSRP/SEP since that is the base price. You could still find cards close to that price if you purchased from a retailer, not a scalper. Best Buy and Gamestop don't pull the "market adjustment" BS that Newegg does.
Bah....sounds like crap to me. Dunno where you are, but Best Buy here has Nvidia cards for a few seconds and they sell out....they never keep any stock, and during the crypto craze there were line-ups and they sold out that way.
Yeah, the other guy is right - Nvidia is good at marketing, but that's because dolts buy their stuff - I mean, every gen, every revision is overpriced - and they either gimp the cards or only make subtle tweaks so you get something like 2-10% improvements or whatever it is - it's pretty much negligible for the price/performance....
If people could have some willpower and just not buy them right away, then Nvidia might even reduce the price at some point - they must have some minimum sales/revenue numbers for the higher-tier and flagship cards that they're happy with, and buyers always ensure their quota. To think I'm called an Nvidia fanboy on here sometimes. LOL!
 
Bah....sounds like crap to me. Dunno where you are, but Best Buy here has Nvidia cards for a few seconds and they sell out....they never keep any stock, and during the crypto craze there were line-ups and they sold out that way.
Yeah, the other guy is right - Nvidia is good at marketing, but that's because dolts buy their stuff - I mean, every gen, every revision is overpriced - and they either gimp the cards or only make subtle tweaks so you get something like 2-10% improvements or whatever it is - it's pretty much negligible for the price/performance....
If people could have some willpower and just not buy them right away, then Nvidia might even reduce the price at some point - they must have some minimum sales/revenue numbers for the higher-tier and flagship cards that they're happy with, and buyers always ensure their quota. To think I'm called an Nvidia fanboy on here sometimes. LOL!
I get what you're saying, but it's just not logical when you look at the market. This is not just a gamers-only card. That is where AMD has stumbled - NVIDIA took the gamble with their design and they've become the "go to" card for home/prosumer/developer/etc. AI work. So it's not as simple as the guys here using restraint, or the general market - because now the market has expanded, and therefore so has demand.

You could also say the same about AMD fans, as their prices have not been taking advantage of their place in the market. They are trying to extract the most dollars as well. $1,000 for a 7900 XTX is not exactly a bargain versus NVIDIA.
 
The business side of this isn't hard to understand.

If you're selling out of a product (including but not limited to video cards), make more. If you can't make more, raise your price. Keep making more and raising your prices until you still have stock - your product is priced appropriately when you have one item left in stock at the point the replacement product becomes available.

Given the demand for video cards, I'm honestly surprised that Nvidia lowered prices at all.
 
I get what you're saying, but it's just not logical when you look at the market. This is not just a gamers-only card. That is where AMD has stumbled - NVIDIA took the gamble with their design and they've become the "go to" card for home/prosumer/developer/etc. AI work. So it's not as simple as the guys here using restraint, or the general market - because now the market has expanded, and therefore so has demand.

You could also say the same about AMD fans, as their prices have not been taking advantage of their place in the market. They are trying to extract the most dollars as well. $1,000 for a 7900 XTX is not exactly a bargain versus NVIDIA.

I love hearing about this AI at home work. What exactly are people doing with Nvidia video cards and AI at home? Considering that businesses have thousands of them to run any kind of useful AI program, I have doubts a single card is doing much.
 
I love hearing about this AI at home work. What exactly are people doing with Nvidia video cards and AI at home? Considering that businesses have thousands of them to run any kind of useful AI program, I have doubts a single card is doing much.
Ever hear of Stable Diffusion?
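For anyone who hasn't tried it, this is roughly what running it at home looks like - a minimal sketch assuming the Hugging Face diffusers and torch packages and the runwayml/stable-diffusion-v1-5 checkpoint (example picks on my part, swap in whatever model you like):

```python
# Minimal sketch: text-to-image with Stable Diffusion on one consumer GPU.
# Assumes the Hugging Face diffusers + torch packages and the
# runwayml/stable-diffusion-v1-5 checkpoint (example choices, not the only ones).
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,  # half precision keeps VRAM use to a few GB
)
pipe = pipe.to("cuda")

# One forward pass of the denoising pipeline produces a PIL image.
image = pipe("a photo of a cat wearing a space helmet").images[0]
image.save("cat.png")
```

In fp16 that checkpoint fits in well under 10 GB of VRAM, which is why mid-range cards handle it fine.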
 
I love hearing about this AI at home work. What exactly are people doing with Nvidia video cards and AI at home? Considering that businesses have thousands of them to run any kind of useful AI program, I have doubts a single card is doing much.
Messing with LLMs. AI is not data-center only. AI is not some superhero power. It's working with models. You can do quite a lot with 16GB+ of VRAM.
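To make that concrete, here's a rough sketch of loading a 7B-parameter model on a single 16 GB card - assuming the Hugging Face transformers and bitsandbytes packages and the mistralai/Mistral-7B-Instruct-v0.2 checkpoint purely as examples, none of this is specific to one model:

```python
# Minimal sketch: running a 7B-class LLM locally on a 16 GB consumer GPU.
# Assumes the transformers + bitsandbytes packages and the
# mistralai/Mistral-7B-Instruct-v0.2 checkpoint (example choices only).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mistral-7B-Instruct-v0.2"
quant = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)

tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant,  # 4-bit weights: roughly 4-5 GB instead of ~14 GB in fp16
    device_map="auto",          # place the model on the available GPU
)

prompt = "Explain VRAM in one sentence."
inputs = tok(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=60)
print(tok.decode(out[0], skip_special_tokens=True))
```

The 4-bit quantization is the main trick: the weights shrink from roughly 14 GB in fp16 to around 4-5 GB, leaving headroom for the context on a 16 GB card.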
 
I love hearing about this AI at home work. What exactly are people doing with Nvidia video cards and AI at home? Considering that businesses have thousands of them to run any kind of useful AI program, I have doubts a single card is doing much.

It's not just big businesses doing it. Running AI generation at home has become a big thing. There are also people who work from home and need their own systems to do their jobs.

Beyond that, the majority of content creators that use editors hire people who do their work from personal systems. Most of those folks aren't going to be buying Quadros; they'll be using consumer-grade components. Nvidia dominates the video and audio prosumer market.
 
3080 was $699?...on paper it may have been but in reality did anyone actually get it for that price outside of the initial rush of Founders cards?...feels like a lifetime ago that GPU prices were actually reasonable...but...this isn't the 4080...it's the 4080 S-U-P-E-R
Look, I have said this before and I'll say it again: I'm just talking MSRPs here. Yes, I am well aware of what happened to street pricing and the gouging by retailers.
 
I love hearing about this AI at home work. What exactly are people doing with Nvidia video cards and AI at home? Considering that businesses have thousands of them to run any kind of useful AI program, I have doubts a single card is doing much.
There are training workloads and inference workloads, and they can be very different.

The Google model that lets you predict the weather anywhere on Earth from internet weather feeds, and that beat simulators running on $200 million supercomputers, can run on a relatively modest laptop.

Training that model took 4 weeks non-stop on a cluster; inference from it runs on a single Google TPU v4.

Photoshop and other products that run AI are made to run on relatively recent, relatively good, but still regular hardware.

If you do something quite specific and have a good training dataset, you can end up with much smaller models than the GPT-3 type that tries to be able to speak about literally anything in multiple languages (and even there, the 3.5-billion-type models that run on regular GPUs are getting good). The ML part of DLSS-type applications or background noise reduction is an example here: it can run on very standard hardware because it is a relatively precise, limited workload.
 
There are training workloads and inference workloads, and they can be very different.

The Google model that lets you predict the weather anywhere on Earth from internet weather feeds, and that beat simulators running on $200 million supercomputers, can run on a relatively modest laptop.

Training that model took 4 weeks non-stop on a cluster; inference from it runs on a single Google TPU v4.

Photoshop and other products that run AI are made to run on relatively recent, relatively good, but still regular hardware.

If you do something quite specific and have a good training dataset, you can end up with much smaller models than the GPT-3 type that tries to be able to speak about literally anything in multiple languages (and even there, the 3.5-billion-type models that run on regular GPUs are getting good). The ML part of DLSS-type applications or background noise reduction is an example here: it can run on very standard hardware because it is a relatively precise, limited workload.

Seems we're just renaming things that were algorithms using CUDA as "AI" now. I think we're starting to stretch the definition of AI, at least in my mind. I just have my doubts it's really as much of a selling feature as it gets touted; if you have that need then of course, but I think that demand is tiny, similar to people needing a Threadripper CPU.
 
Seems we're just renaming things that were algorithms using CUDA as "AI" now. I think we're starting to stretch the definition of AI, at least in my mind. I just have my doubts it's really as much of a selling feature as it gets touted; if you have that need then of course, but I think that demand is tiny, similar to people needing a Threadripper CPU.
Everything I mentioned is a clear case of ML, which uses learned rather than hand-written code for part of its results, and is not at all just renaming human-made algorithms.

It goes both ways: over time "AI" gets restricted more and more to what computers have yet to do well. In the past an NPC decision tree in a game would have been called AI without a second thought; now people would balk at calling it AI, especially a simple one. A computer playing chess was a sci-fi AI dream in the past; now some would not call a computer playing chess AI.

Soon voice recognition will not be called AI by some, and after that, word auto-completion (which made us freak out at first on Google) will stop being called AI by some.
 
Best Buy and Gamestop don't pull the "market adjustment" BS that Newegg does.
Newegg was once a great retailer: great prices, great selection, friendly RMA policies, etc. That was then. Nowadays I use Newegg mainly for IDing products and checking reviews and pricing, then I buy from Amazon or Best Buy. B&H Photo is also a great store, but they have a smaller selection than Newegg. I hated the "bundles" that Newegg was trying to foist on us when GPU prices were going crazier by the day.
 
Newegg was once a great retailer: great prices, great selection, friendly RMA policies, etc. That was then. Nowadays I use Newegg mainly for IDing products and checking reviews and pricing, then I buy from Amazon or Best Buy. B&H Photo is also a great store, but they have a smaller selection than Newegg. I hated the "bundles" that Newegg was trying to foist on us when GPU prices were going crazier by the day.
Newegg has been doing market adjustments to their prices for a long time. I recall during the Litecoin craze back in 2012-2013 that they were selling an R9 290X for $900 when I was in the market for a new video card.
 
Newegg has been mostly a PC case and occasional GPU store for me. I'm pretty sure it's been 4-5 years since I bought a mobo, RAM, storage, or a CPU from them.
 
instead of just getting a 4070 Ti Super I'm debating upgrading my entire rig (CPU/memory/motherboard)...or maybe I'll just keep my 3080 and upgrade the system around it...or the 3rd option is to keep everything as-is and get a new home theater display - the Sony A95L (QD-OLED)...decisions, decisions
 
instead of just getting a 4070 Ti Super I'm debating upgrading my entire rig (CPU/memory/motherboard)...or maybe I'll just keep my 3080 and upgrade the system around it...or the 3rd option is to keep everything as-is and get a new home theater display - the Sony A95L (QD-OLED)...decisions, decisions

If you can wait, then RDNA 4 / Battlemage should be able to match the 4070 Ti Super for $150-$250 less.

Maybe just keep your existing 3080 as long as possible...
 
If you can wait, then RDNA 4 / Battlemage should be able to match the 4070 Ti Super for $150-$250 less.

Maybe just keep your existing 3080 as long as possible...

by that time Nvidia's Blackwell should also be close to release...
 