Blade-Runner (Supreme [H]ardness), joined Feb 25, 2013, 4,410 messages
Given the rumours that 3nm gaming GPUs are expected only in 2025, and the insufficient VRAM on every card below the 4090, it makes sense for Nvidia to refresh the entire stack next year.
> 4080 Super has the potential to be really good depending on the price...right now the only 4000 series card worth buying on the high end is the 4090

Hopefully Nvidia doesn't go full retard on pricing again.
> yeah Nvidia really needs to fix their VRAM issue...AMD is killing them in that aspect

Sure, if taking 3% market share in total shipments from NVIDIA last quarter is considered "killing" them.
> Didn't we go through this song and dance with Ampere? I doubt NVIDIA will be bringing back the "SUPER" branding.

They used that branding once, and now every rumor since Turing has been that they're doing a "Super refresh".
Yep, GDDR6X models in particular. 4070 and up.
> Didn't we go through this song and dance with Ampere? I doubt NVIDIA will be bringing back the "SUPER" branding.

Nvidia didn't leave themselves as much room this time to simply add "Ti".
> 4080 Super has the potential to be really good depending on the price...right now the only 4000 series card worth buying on the high end is the 4090

The 4080 is a great card at a bad price. At $800 it would be great. IMO, the 4070 Ti shouldn't even exist right now; it should just be the 4070 and the 4080.
> Nvidia didn't leave themselves as much room this time, to simply add "ti".

Well, if the 4080 Super uses an AD102, they could do quite well in closing the gap on the 4090.
> Well if the 4080 Super uses an AD102…

I'm not sure why people keep bringing up the history of the different chips. It's clear that AMD and Nvidia are both bucking that. You're paying for the end performance, and the naming is based on that, not on which chip is under the heatsink.
Yup, the choice will still be to just buy a 4090 or nothing at all.
I'll just wait for the $2500 5090 at this point.
> What you spend your money on is something I would respect. That said, I do not see a single game or program out there, unless you are making money from it, that justifies or even means that a 4090 at $1600 is worth it, at all. Games back in the 2000s, and maybe up to 2018, were often impressive and unique, but not really so much anymore.

I don't know, man, spending $1600 to play Cyberpunk with raytracing seems worth it to me (sarcasm).
> I'm not sure why people keep trying to bring up the history with the different chips. Its clear that AMD and Nvidia both, are bucking that. You are paying for the end performance and the naming is based on that. Not which chip is under the heatsink.

The comment was made because the 4080 already uses almost all of the AD103 chip; going from 95% of the cores to 100% would not add much. Memory-system changes aside, a 4080 Super would not bring much.
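For rough numbers, here is a quick sketch of that uplift, assuming the commonly reported die configurations (AD103 has 80 SMs, of which the RTX 4080 enables 76; these figures come from published spec sheets, not from this thread):

```python
# Best-case uplift from fully enabling AD103, in SM terms.
# Assumed (reported) configs: RTX 4080 = 76 of AD103's 80 SMs.
enabled_sms = 76
full_sms = 80

uplift = (full_sms - enabled_sms) / enabled_sms
print(f"Theoretical SM uplift: {uplift:.1%}")  # ~5.3%, before any memory changes
```

So even a fully enabled AD103 only buys about 5% more shader hardware, which is why the post above argues a 4080 Super on the same die would not move the needle much.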
> If you could make money from it, I would say it is good but, if not.......

Pfff, make money with it? It will just sit unused for months at a time, like my 3080 Ti.
> The comment is made because the 4080 already use almost all the AD103 chip, going from 95% of the core to 100% would not be much, memory system change aside a 4080 super would not bring much.

Well, the 3070 Ti did a bunch of stuff on paper to try to justify itself, and that resulted in single-digit gains over a 3070.
If a 15% faster card were possible without doing a new chip or dipping into the worst-bin reserve of the AD102, that poster would not mind it.
Considering the product stack, they could use the Super branding as an excuse to move pricing to where they want it, without an official massive price-cut announcement for no apparent reason.
The 4070 Ti already uses 100% of the AD104, so there is not much room to go. Bad AD103 dies that can no longer be shipped in laptops... maybe.
Could GDDR7 being ready "save them"? That would be an easy bump without much work.
They arguably already have pretty much all they need; they just have to adjust names and prices as needed.
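To put a rough number on the GDDR7 idea: peak memory bandwidth is just (bus width / 8) × per-pin data rate. The sketch below assumes the 4080's reported 22.4 Gbps GDDR6X and a hypothetical ~32 Gbps early-GDDR7 rate on the same 256-bit bus; the GDDR7 figure is an assumption, not a spec from this thread:

```python
def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

gddr6x = bandwidth_gbps(256, 22.4)  # RTX 4080's reported config
gddr7 = bandwidth_gbps(256, 32.0)   # assumed early-GDDR7 per-pin rate
print(f"GDDR6X: {gddr6x:.0f} GB/s, GDDR7: {gddr7:.0f} GB/s "
      f"({gddr7 / gddr6x - 1:.0%} more)")
```

Under those assumptions, a straight memory swap would be roughly a 40%+ bandwidth bump with no die changes at all, which is the "easy bump without much work" being described.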
> Point being, I could see Nvidia shafting the 4070 ti super in a similar way, if there even is one.

Yeah, I could see that. Unlike the 3070 Ti, which used a new SKU name as an excuse to raise the price, this would be a new product just for the sake of having one: almost irrelevant hardware-wise, existing only to change the price (downward, this time) without losing face.
> If the $800 8900xt beat by a good enough (or a $550 8800xt that get too close) they could need to shake stuff up, maybe Turing super generation level.

Based on current rumours, $800 cards from AMD are unlikely in 2023, I guess. 2025 is more like it.
> That said, I do not see a single game or program out there, unless you are making money from it, that justifies or even means that a 4090 at $1600 is worth it, at all.

What about crypto mining? Is that still a thing?
> I'm not sure why people keep trying to bring up the history with the different chips. Its clear that AMD and Nvidia both, are bucking that. You are paying for the end performance and the naming is based on that. Not which chip is under the heatsink.

The 4090 on AD102 and the 4080 on AD103 have a huge gap between them. But the 4080 doesn't leave a lot on the table for growth; by moving it over to an AD102, they could easily land it at a halfway point between the 4080 and the 4090 and call it a day.
But it would kind of suck for me if they released a 4090 Super/Ti shortly after, because then my own purchase, which I just finally committed to, would lose some value. Knowing my luck, it'll happen, though.
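For a rough sense of where that halfway point would land, a back-of-the-envelope sketch in SM terms (the counts are the commonly reported configs, RTX 4080 = 76 SMs and RTX 4090 = 128 of AD102's 144, not figures from this thread):

```python
# Where a hypothetical AD102-based 4080 Super could land, in SM terms.
# Assumed (reported) configs: RTX 4080 = 76 SMs, RTX 4090 = 128 SMs.
sms_4080, sms_4090 = 76, 128

halfway = (sms_4080 + sms_4090) // 2
print(f"Halfway point: {halfway} SMs "
      f"({(halfway - sms_4080) / sms_4080:.0%} more than the 4080)")
```

A ~102-SM cut of AD102 would still leave the 4090 comfortably on top while giving the hypothetical Super a meaningful gap over the plain 4080.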
Interesting. Price creep for the RTX 4090:
An examination of current RTX 4090 pricing reveals that only the more costly GPUs are available. In addition, since they launched, the only card always at MSRP—Nvidia's Founders Edition—has almost disappeared. This GPU was a Best Buy exclusive, and we didn't even see it come up in a search for "RTX 4090." Once we narrowed the search to "Founders Edition," it appeared and was out of stock. According to Tom's Hardware's analysis, this is a new phenomenon for the card, indicating something is going on with the supply of these GPUs.
Sure, you can still buy an RTX 4090, but if you want one at MSRP, your only option right now appears to be the PNY version at Best Buy. Over on Amazon and Newegg, you won't find an MSRP card, which deviates from precedent. The least expensive cards are one from Galax for $1,654 on Amazon or a Gigabyte card on Newegg for $1,649. That means the smallest premium you'll pay is $50 over MSRP, but most available GPUs cost more than that, so you're looking at paying an additional $100 to $400 to own the world's most powerful GPU.
This situation seems to result from what we reported in August: that Nvidia was shifting production of its biggest chips to its AI accelerators, where margins are much higher. The same report said retailers were also finding the supply of RTX 4090s severely constrained, which aligns with what we're seeing now. Restricting the supply of the RTX 4090 is also a way to ensure prices remain high while helping the company divert precious TSMC resources to its AI chips. The true source of this price creep remains elusive. But it also seems certain that the lack of competition isn't helping.
https://www.extremetech.com/gaming/nvidia-rtx-4090-prices-are-going-up
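To put the article's numbers in perspective, a quick sketch of the premiums over the RTX 4090's $1,599 MSRP, using the two cheapest listings the article cites:

```python
# Premiums over the RTX 4090's $1,599 MSRP for the article's cheapest listings.
msrp = 1599
listings = {"Galax (Amazon)": 1654, "Gigabyte (Newegg)": 1649}

for name, price in listings.items():
    premium = price - msrp
    print(f"{name}: +${premium} ({premium / msrp:.1%} over MSRP)")
```

Even the cheapest cards carry a 3%+ premium, and the article's $100-to-$400 upper range works out to roughly 6% to 25% over MSRP.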
> 4080 is a great card, at a bad price. If they were 800, they would be great. IMO, the 4070 ti shouldn't even exist right now. It should be 4070 and 4080.

Even just looking at the 40-series stack (chip tiering, etc.) and its performance against the previous generation, it's clear the 4070 Ti should just be called the 4070: it matches the previous-gen flagship 3090 Ti, which is what standard 70-class cards have done for generations now. But they got greedy trying to make it a "4080 12GB".