NVIDIA to Re-introduce GeForce RTX 2060 and RTX 2060 SUPER GPUs

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
10,785
"The source also claims that the pricing structure of the old cards will be 300 EUR for RTX 2060 and 400 EUR for RTX 2060 SUPER in Europe. The latter pricing directly competes with the supposed 399 EUR price tag of the upcoming GeForce RTX 3060 Ti model, which is based on the newer Ampere uArch instead of the last-gen Turing cards. The possible reason for such a move is a scarcity of the GA106/GA104 silicon needed for the new cards, and the company could be aiming to try and satisfy the market with left-over stock from the previous generation cards."



https://www.techpowerup.com/277507/...rce-rtx-2060-and-rtx-2060-super-gpus#comments
 
Someone with more business acumen and who is more tech savvy than I am will be able to analyze this move.

Me? Reopening an "obsolete" product line reeks of being the precursor of large-scale economic problems. Supply chain, spending ability, market demand...something's broken.
 
TSMC might have some capacity on 12nm for Turing. That actually makes sense with everyone clamoring for their 5nm and 7nm lines.

But we also know substrates are tight, even in the auto industry.

This might be a way to relieve some pressure by opening a path of least resistance. If substrates start flowing, then I bet ramping up Turing is very easy. AMD is obviously much more constrained with everything on 7nm TSMC.
 
My money is on supply chain issues. Samsung can only churn out so many 8nm wafers, and NVIDIA is now trying to roll out the entire Ampere product stack top-to-bottom. Maybe they had to reallocate wafers to start churning out mobile GPUs? I'm guessing TSMC has extra 12nm fab capacity since everyone wants 7nm and 5nm for their hot new products these days, and that NVIDIA wants to extract every dollar from gamers before AMD has a chance to enter this market segment.

I don't think NVIDIA would go through the expense of taping out the 2060 over at Samsung, and I'm also not sure they could even do so legally. When you partner with a fab to tape out a design, they grant you access to their PDK (process design kit), which contains trade secrets. Moving an existing design to another fab down the line might constitute a breach of that NDA, among other things. So I'm guessing these are going to come from TSMC just like Turing always has.
 
I'm not sure where the "left over GPU dies" theory came from; I've also seen it at Wccftech, but it's utter BS. If they had a big stash of Turing dies left, they'd've made more 20xx cards and sold them before releasing the 3060.

This is spinning up fresh production, probably, as noted by others above, because TSMC has spare capacity at 12nm; and the 30cm wafers it uses are much more of a commodity than the 20cm ones whose shortage is apparently the cause of automakers' pain on ultra-legacy processes.

With demand exceeding supply, and apparently not going to change anytime soon, this is a good thing; but I wish NVIDIA had realized the crunch was coming far enough in advance to have never stopped 2060 production in the first place, and put the wafers that have gone into 3060s into more 3070+ production instead.
 
I paid $350 for my 2060 Super quite some time ago, and the market needs decent $200-300 GPUs yesterday. If they come out at original MSRP or more, that sucks. Thing is, the shortage is so prevalent that they will probably sell regardless.
 
That's a serious understatement. Not only is the GTX 1060 still the #1 graphics card used on Steam, but it seems to be climbing up again. Hey, guys with RTX 3000 series cards, you should pay attention to this: if the average gamer has the equivalent of a GTX 1060, then ray tracing won't be added to many games. The RTX 2060 isn't a bad card, but that thing has a hilarious history to it. When Nvidia released the 2060 Super to combat AMD's RX 5700s, the regular RTX 2060 started to sell like crazy because the price dropped down to $300. They were supposed to phase it out but kept it going due to demand. Three years later Nvidia is still selling these things for basically the same price. Worse yet, I know 2060s are selling at like $400, so the demand is already pretty high.

Isn't Nvidia a little embarrassed that their 5-year-old GTX 1060 is still more popular than any RTX card they've released so far? Where's AMD in all this? AMD's most popular cards are the RX 580 and 570, which are also 5 years old. Forget the 5700s; they hardly exist on Steam.
 

The 1060 jumping like that is just a data artifact. A few years ago, shared computers in Chinese gaming centers massively distorted Steam's stats for about half a year until Steam worked out a fix to stop them getting counted many times over. This month's HW survey shows a large jump (+8.75%) in Simplified Chinese (the version used in mainland China) language users, indicating that their de-dupe filter has sprung a leak again. Hopefully they'll fix it faster this time.
 
Seems like NVIDIA sees an opportunity to cash in on this demand, and they have the supply to make a great card at a good margin for themselves, so why not take advantage of it? People are clamoring to get their hands on some type of card, and at least this will hold people over for the time being...
 
Pretty much. As others have said, the presence of more GPUs on the market, period, will help satisfy demand and bring prices back to the realm of reason across the board. That, or everything is up 20-40% from here on in. Hopefully not.
 
IF they are on a different supply chain, great. I am not sure why (or if?) they ever stopped making those, actually; I guess they could restart supplying more than just the big PC builders?

Better would be 5700/5700 XT cards flooding the market.
 
They stopped making the old ones so you'd have to buy the next, more expensive, generation.
 
These vendors have done this many times in the past.

A GPU can be re-used in the next gen product lineup as a lower end part. Many times they rename it as well, but it's actually the exact same GPU. At least for this release they are keeping the name so you know what it compares to. The predicted prices seem a bit too high though. Knock $50 off and keep them cheaper than 30xx cards, and it makes sense.
 
I don't believe this for a second: by the time NVIDIA spins back up production capacity (3 months) and then actually builds the cards (3 months), the supply issues will likely have resolved themselves.

I don't expect miners to grab the 3060s, as you're paying a $50 premium for memory they won't use (why not grab the 3060 Ti for $70 more?).

This new chip will also help ease demand for Ampere, and we will eventually see cut-down cards around $250.
 
They have the chips, and they can be produced on a node that isn't in high demand. For 1080p the 2060 series is still a good card, and if they can supply them in any degree of quantity at a reasonable market price, then people will buy them.
 
Source (outside the 2070 and up)?

By the time that dynamic would have occurred for the 2060/1660 type of cards, the "all you can make sells at a ridiculous price" situation was already well established.
 

Since giving my oldest my 2080 Super, I have been "slumming" on a 2060 Super at 1440p and it's fine. Besides, the vast majority of gamers are spending $200-400 on a GPU, so it's right in that sweet spot. Still, I would like to see a second run of these for less than original pricing.
 
I don't have a source; it's just a bet based on nGreedia.
 

Yeah, this is true. The GTX 770 was just a rebranded GTX 680, for instance. If TSMC has 12nm capacity available, it honestly makes sense... but only if priced lower than Ampere cards.
 
I would be fine if they rebranded it as an RTX 3050/Super and priced it at ~$200.

We need a consumer win somewhere.

1060s and rx580s are starting to get oooooold and slow.
The problem with a rebrand is that it would get in the way of Ampere-based 3050-level products in the future. Assuming they have such products planned, I'd rather see a bifurcated product line for a while than have to deal with the confusion of a 3045 and 3055 being Ampere based and performing significantly differently from a Turing-based 3050 series.

The 770 was a different story, since the 600 and 700 series were both Kepler at all the model numbers high enough to be relevant, and the difference between GK10X and GK11X parts didn't really matter at the consumer level.
 
Well, by the time "Ampere based 3050 level products" come out, it'll be time for the 4000 series...
The problem you pointed out isn't much of a problem at all. Anything Ampere based under a 3060 will most likely be GTX anyway, just like Turing's 1660, 1650, etc.
 
Not surprising. I'm guessing they picked the 2060 lineup because it's not really going to compete with current products, should those ever come back in stock. By the time prices come back down and stock levels normalize, they can reduce the 2060 prices to clear stock before introducing anything lower than the 3060.

I paid $330 for my 2060 just a couple of months after partner cards came out at the end of 2018.
Why is it every time I buy a mid-range GPU with the expectation to replace it in a year or two, I get screwed over by shortages when that time comes? The exact same thing happened when I bought an RX 480.
Nvidia has stated all new cards this gen will be RTX. They realized that only releasing the middle-to-top end of their lineup as RTX made it seem like even they didn't have much faith in ray tracing. Although my 2060 can't really handle any sort of ray tracing beyond turning it on just to watch a pretty slide show, the new cards seem to step it up a little bit. I still doubt the low end will be able to use it very well, unless some serious optimization techniques get figured out. Either way, supposedly all new (desktop) cards will be RTX.
 

This loop here is why the "Ti" surfers can fare better: just save up that thousand bux for the first card and roll it for two gens at least. Look how long the 1080 Ti held up. Now, with that being said, the 1080 Ti was maybe too good, and Nvidia probably considers it a mistake in longevity, one they probably won't repeat. Either way, if you go big GPU-wise it tends to last, but I'm in that $300-600 range these days myself, which is way up from the $250 max-buy rule I had years ago.
 
Yeah, I got that 480 for $150 by stacking coupons at the now defunct Jet-dot-com. I really didn't want to spend $300+ on a new card. But I realized I had originally gotten a good deal on the 480, price/performance had stagnated from 10xx to 20xx, and I really wanted an upgrade so that I wasn't sometimes struggling/compromising for 60 FPS at 1080p.

But now I'm in the same boat for some newer games, and I'd also like to step up to 1440p. Looks like that's going to have to wait until Bitcoin tanks again...
 
I too know this pain. I upgraded to a 1440p 144Hz monitor when I had my RX 480. That started the path to $400-plus GPUs, on which I find myself today. Problem is, once you go 1440p it's tough to go back. I'm fine with 60 FPS even if the monitor is set to 144Hz, but that higher res is so nice. Right now I'm on a 2060 Super, and 1440p is fine, even Cyberpunk with RTX and optimized settings. Yes, lows in the high 30s, but turn RTX off and lows closer to 60 are achievable on this $400 GPU. What really blows is that the 3060 Ti is an excellent 1440p card for the price point it is supposed to have, and of course it's a unicorn too.
 
I went from 1440p back to 1080p because the vertical alignment (VA) monitors just looked better than TN.
I don't even care that I switched back to 1080p; the image looks better.
 