RTX 3060 12GB

euskalzabe

Look, I acknowledge that this is, for now, not confirmed. However, it blows my mind that Nvidia is even considering doing this, again:

MANLI submits GeForce RTX 3080 Ti, RTX 3070 Ti, RTX 3060 and RTX 3050 to EEC - VideoCardz.com

What's the part that's really bothering me here?

[attached screenshot]


You cannot have one GPU name with two different shader configurations. Using the same GPU name and differentiating the two models only by memory advertises that these are the same GPU with different amounts of memory. That is a lie.

When I bought my 1060 3GB, I was under the impression I was buying the same GPU with less VRAM; I mostly gamed at 1080p, so I figured it didn't matter that much. I had no idea what I was really buying was a 1050 Ti renamed into a 1060. Then some sites started justifying it, saying, well, if you game at 1080p the difference is not huge! And a few weeks later, Digital Foundry showed exactly how important the difference was if you didn't have a lot of system RAM. Meaning, if you had 8GB (like I did at the time), those 3GB will definitely hinder the experience; if you have 16GB, less so, but still. By the time I found out about this, that the 6GB version was more powerful and that I had not bought the performance tier that was advertised to me, I was already out of the return window.

Now they're going to do the same with a 3050 Ti that's renamed into a 3060 with less VRAM. The deceitfulness is so plainly obvious when you know where to look. Notice how the just-released 3060 Ti is a 3070 GPU with a lower shader count, but it's not marketed as a 3070 lite. Yet Nvidia is playing this dishonest game at their most successful range, the x60 level. Further smoking gun? Notice how the 3060 12GB is the only Ampere GPU that has a 400-level designation. The 300 designation of each chip is normally the full, top version, and the 200 is the lower-tier Ti version. Yet somehow the 3060 doesn't top out at 300; it tops out at 400, with a cut-down 300 below it. 3584 shaders are not 3840 shaders. These are not two 3060 cards. This is unacceptable, it's false advertising, and I can't believe Nvidia is going to do this again. I hope someone sues them this time.

Of course, now I know better than three years ago, and this means that if I consider going Nvidia, the 12GB card is the only option, the only real 3060. However, this is the kind of shit that may very well drive me to a 6700 just for ethical reasons. DLSS won't be the only AI upscaling in town in 2021, and it's not universally supported across games, so it's not that powerful a deciding factor.

/rant
 
I wouldn't be surprised if they jack up the prices too.
Nvidia is just one of those companies that GOT LUCKY with their engineers.

They got popular and powerful, but ...do they deserve it at this point in time?

When it comes down to it, ethics and honor MATTER.
It certainly hinders profit in the short-term, but long-term success does in fact hinge on honor, integrity, and true intelligence.

Not cheap marketing cleverness, supply-and-demand manipulation, and hype.
Those provide a quick buck, but in the end, they will never deliver those long-term results that truly powerful innovators demand.
 
I wouldn't be surprised if they jack up the prices too.
Nvidia is just one of those companies that GOT LUCKY with their engineers.

They got popular and powerful, but ...do they deserve it at this point in time?

When it comes down to it, ethics and honor MATTER.
It certainly hinders profit in the short-term, but long-term success does in fact hinge on honor, integrity, and true intelligence.

Not cheap marketing cleverness, supply-and-demand manipulation, and hype.
Those provide a quick buck, but in the end, they will never deliver those long-term results that truly powerful innovators demand.
Seems to be working out for them. They could probably just drop the gaming market and barely feel it.
 
They count 'sold to miners' as 'Gaming' -- don't be fooled
Not entirely accurate - neither Bitcoin nor Ethereum benefits as outrageously from GPUs as it used to. The former is more profitably mined with dedicated hardware (ASICs), and with the latter moving to proof-of-stake, there is way less incentive to use GPUs for mining.
 
Not entirely accurate - neither Bitcoin nor Ethereum benefits as outrageously from GPUs as it used to. The former is more profitably mined with dedicated hardware (ASICs), and with the latter moving to proof-of-stake, there is way less incentive to use GPUs for mining.
Might give this a read. https://ethereumclassic.org/blog/2020-11-27-thanos-hard-fork-upgrade

At current epoch (372) the DAG size is 3.91 GB. 4GB GPUs are getting obsoleted while they still make up a significant amount of ethash hashrate. Activating this change at epoch 376 (for example), would reduce the DAG size from 3.94GB to 2.47GB. With the reduced growth rate, 4GB GPUs will remain supported for an additional 3+ years.
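For context on where those numbers come from: the Ethash DAG grows by a fixed amount every epoch, and the ECIP-1099 ("Thanos") change described above effectively halves the epoch number used for the DAG calculation. A minimal sketch, assuming the standard Ethash constants and skipping the prime-number adjustment step (which shifts the result by well under 0.1%):

```python
# Approximate Ethash DAG size per epoch (prime adjustment ignored).
DATASET_BYTES_INIT = 2**30      # 1 GiB starting DAG size
DATASET_BYTES_GROWTH = 2**23    # ~8 MiB of growth per epoch

def approx_dag_size_gib(epoch: int) -> float:
    """Approximate DAG size in GiB for a given Ethash epoch."""
    return (DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch) / 2**30

# Pre-fork: epoch 372 -> ~3.91 GiB, right at the edge of a 4 GB card.
print(round(approx_dag_size_gib(372), 2))        # 3.91

# Post-fork: ECIP-1099 doubles the epoch duration, so epoch 376 maps to
# a "calibrated" epoch of 376 // 2 = 188 -> ~2.47 GiB.
print(round(approx_dag_size_gib(376 // 2), 2))   # 2.47
```

That lines up with the 3.91 GB and 2.47 GB figures in the quote, and it's why 4GB cards stay usable on ETC for a few more years after the fork.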
 
NVIDIA doing their Ti dance again. Get everyone excited about their new cards, short-supply them, make money, then announce the replacement cards to come out in a couple of months.....
 
Was anyone NOT expecting Ti cards? I don't even remember the last family of GPUs that didn't have one.
I don't think MrGuvernment takes issue with the existence of Ti cards. Rather, he mentions Nvidia's "Ti dance": a Ti moniker that is inconsistent in what it means (clearly seen in the x60 range) and in its positioning within the lineup (since the x70 and x80 Ti cards are about to supplant their non-Ti equivalents three months after release, perhaps signaling Nvidia's surprise at AMD's showing this time around).

Ti or no Ti, however, a tier is a tier. If you change the number of shader cores, you change the performance, you change the tier. That 3060 Ti is really the full fat 3060, while that 3060 is a 3050 Ti in disguise.
 
. That 3060 Ti is really the full fat 3060, while that 3060 is a 3050 Ti in disguise.
Considering both the 3060 Ti and the 3070 are GA104 with the exact same RAM solution, isn't the 3060 Ti a watered-down 3070 in disguise, and the GA106 cards the real 3060s?

I was under the impression I was buying the same GPU with less VRAM,

That may sound a little petty, but with the industry almost never making the exact same card with only the VRAM quantity changed (the RX 580 had different memory speeds; there is almost always something else being changed), it is harder to know how much value there really is in it.
 
Considering both the 3060 Ti and the 3070 are GA104 with the exact same RAM solution, isn't the 3060 Ti a watered-down 3070 in disguise, and the GA106 cards the real 3060s?
Yes and no. Problem is Nvidia has been meddling with what their lineup means for years now. It used to be:

102 was 80 series
104 was 70 series
106 was 60 series
107 was 50 series

Now you have more segmentation:

102-300 is a 3090
102-250 is a 3080 Ti
102-200 is a 3080
103-300 is a 3070 Ti
104-300 is a 3070
104-200 is a 3060 Ti
106-400 is a 3060 12GB
106-300 is a 3060 6GB
107-XXX is a 3050

Seeing this:
- You could say the 300 level in each GPU tier is the full die, like the 3090 and 3070. But then the 3080 is a 200 level, and the 3060 is a 400 level.
- You could say all tiers top out at the 300 level for the full die. But the 3060 makes up a magical 400 level not present anywhere else. This suggests it's been artificially inflated, and the 400 is actually the 300. This is because...
- You could say the 200 level is the cut down version, same GPU but sold as lower tier brand. But the 3080 Ti is 200, it's not a 3070 Ti. And the 3070 Ti is its own 103 GPU (wtf?).

So basically, nothing makes any fucking sense. Nvidia is playing around with what they sell you:
You want a full die 102 GPU? That could be an x90 or an x80 series, who knows.
You want a full die 104 GPU? That could be an x70, but maybe it's a whole new GPU classification at 103!
You want a full die 106 GPU? That could be a 3060, but we'll also sell you a 3060 with the same name but a different shader count and performance.

They are dominant in the market; there is no reason to play these games. None at all.

$150 for 50 series
$200 for 50 Ti series
$300 for 60 series
$400 for 60 Ti series
$500 for 70 series
$600 for 70 Ti series
$700 for 80 series
$800 for 80 Ti series

And I don't count the 90 series, since $1500 for those is insanity territory. Having such a mostly clean lineup (arguable, but still), why meddle with the 60 series? Why lie to consumers? Why advertise a 60-series card that is not actually a 60-series card? They have the consumers. They have the products. They have the market segmentation. Lying to consumers and fucking around with SKUs that conceal what is being sold is how you get consumers to hate you and abandon you the second there's an alternative. AMD has improved, but I still don't think they're super competitive as of now (raster performance is there, but everything else in their cards is lacking). But let me remind you that Intel is coming in 2021, and even if they (probably) don't succeed amazingly on their first release, they're clearly serious about GPUs now. A three-way race won't let Nvidia remain comfortable for long. And if these practices continue - and get worse - then when the time comes, I and many others will drop Nvidia like a hot rock the second there's a mildly attractive alternative.
 
why meddle with the 60 series? Why lie to consumers? Why advertise a 60-series card that is not actually a 60-series card? They have the consumers. They have the products. They have the market segmentation. Lying to consumers and fucking around with SKUs that conceal what is being sold is how you get consumers to hate you and abandon you the second there's an alternative.

They've been doing that shit since they re-birthed the Ti series back in the Kepler days.

GTX 660 Ti ring a bell? The GTX 660 was a 106, while the Ti was a heavily-cut 104.

Then they continued that same shit with the Kepler refresh (GTX 760 = GTX 670 rebrand).

Nvidia has typically been using the rebirthed Ti branding to indicate a heavily cut-down next-chip-up. If so, then the 3060 being based on the Ampere 106 would be keeping up with their consistent naming conventions.

But now it's been heavily abused for uncut cards. There's no stopping them (along with adding the additional Super rebrand).
 
Nvidia has typically been using the rebirthed Ti branding to indicate a heavily cut-down next-chip-up. If so, then the 3060 being based on the Ampere 106 would be keeping up with their consistent naming conventions.
My point exactly. The cut version of the 106 should be a 3050 Ti, not a 3060. The cut 104 is sold as a 3060 Ti. Instead of a full 106 sold as a 3060 and a cut 106 sold as a 3050 Ti, you now have:

A full-fat 106, sold as the 3060 12GB
A cut-down 106, sold as the 3060 6GB
A further cut-down 106, sold as the 3050 (Ti, I'm guessing, but we don't know yet)

Therein lies the problem: extra segmentation in the x60 range with no name change, both sold as the "same" 3060, supposedly differing only in memory, which is untrue. A full 106 ain't a cut-down 106, not by a long shot.
 
$150 for 50 series
$200 for 50 Ti series
$300 for 60 series
$400 for 60 Ti series
$500 for 70 series
$600 for 70 Ti series
$700 for 80 series
$800 for 80 Ti series
GDDR6X, IIRC, is super expensive though. I would bet the 80 Ti is going to be at least $900 or even $1k+, especially after the preliminary 6900 XT review results show only slight improvement over the 3080 (though that could change with driver and firmware improvements).
 
GDDR6X, IIRC, is super expensive though. I would bet the 80 Ti is going to be at least $900 or even $1k+, especially after the preliminary 6900 XT review results show only slight improvement over the 3080 (though that could change with driver and firmware improvements).

They've been waiting for the 2GB-density chips (the same trick the entire RDNA2 lineup uses).

It will only add another $100 to the price.
 
They've been waiting for the 2GB-density chips (the same trick the entire RDNA2 lineup uses).

It will only add another $59-$80 to the price.
A whole $59! How could Nvidia add $89 to the price? That's ridiculous; they would consider adding $129 to an already expensive card! I sure hope there's availability with that $199 premium :p

Got it, so we're waiting on Micron to finish the GDDR6X 2GB-density chips so they don't have to change up the layout and cooling design. And since the 6000 series uses GDDR6, which already has 2GB chips, it makes sense why they're ahead VRAM-wise.
A little glad now that I wasn't able to get my hands on one. I'm not one to put in too much effort, though: if I can get it, great; if not, it'll be available at some point in the future.
 
They've been waiting for the 2GB-density chips (the same trick the entire RDNA2 lineup uses).
I wonder then what "trick" AMD was using with Polaris, to still sell their GPUs for equal or less money while consistently having more VRAM. Whatever it is, I'd like Nvidia to learn it already.

I'm expecting a $299 3060, I'd guess that'll be at 6GB and a 12GB would go for $349. Considering the different GPUs, not a great deal. If AMD can get their 6700 12GB at $299, that'll be a very tight decision for me to make.
 
I wonder then what "trick" AMD was using with Polaris, to still sell their GPUs for equal or less money while consistently having more VRAM. Whatever it is, I'd like Nvidia to learn it already.

I'm expecting a $299 3060, I'd guess that'll be at 6GB and a 12GB would go for $349. Considering the different GPUs, not a great deal. If AMD can get their 6700 12GB at $299, that'll be a very tight decision for me to make.
The 'trick' is that they're using GDDR6 in all of the high-end 6000 series; the 3080/3090 are the only ones using GDDR6X currently.
So not only is it cheaper, the 2GB densities are already readily available, which is why the 3070 and 3060 Ti are much cheaper as well.
I'm sure Nvidia didn't want to overshadow their flagship GPU in amount of VRAM, which is why those are 8GB (also why they're likely slated for an upgrade to 16GB if/when the 3080/3080 Ti upgrade to 20GB happens).


Never mind, I just reread and noticed you were talking about the RX 500 series.
 
I wonder then what "trick" AMD was using with Polaris, to still sell their GPUs for equal or less money while consistently having more VRAM. Whatever it is, I'd like Nvidia to learn it already.
It's called a 256-bit memory bus.

The GTX 1060 only has 192 bits available.

That trick is called "lowering your GPU margins," a trick you have to use when you are behind on absolute performance, perf/watt, and bandwidth efficiency.

AMD has actually been raising prices with RDNA1 (the 5500 was decimated by the GTX 1650 Super, and the 5600 barely matches the $300 RTX 2060). As a result, these have only sold well since NVIDIA discontinued Turing last quarter (you have no choice).

AMD is still charging you an $80 premium for the 2GB memory chips on the 6800; they have decided to stop losing money on folks like you.
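For what it's worth, the bus width also pins down which capacities are even on the table: each GDDR5/GDDR6 chip hangs off a 32-bit slice of the bus, so the chip count (and with it the natural capacity options) falls straight out of the bus width. A rough sketch, assuming one chip per 32-bit channel and the common 1GB/2GB chip densities (clamshell and mixed-density layouts ignored):

```python
# Which VRAM capacities a given bus width naturally supports, assuming
# one GDDR chip per 32-bit channel and 1 GB / 2 GB chip densities.
def capacity_options_gb(bus_width_bits: int, densities_gb=(1, 2)):
    chips = bus_width_bits // 32            # one chip per 32-bit channel
    return {f"{d} GB chips": chips * d for d in densities_gb}

for name, bus in [("RX 580 / RTX 3060 Ti (256-bit)", 256),
                  ("RTX 3060 (192-bit)", 192),
                  ("rumored RTX 3050 (128-bit)", 128)]:
    print(name, capacity_options_gb(bus))

# 256-bit -> 8 chips -> 8 GB or 16 GB
# 192-bit -> 6 chips -> 6 GB or 12 GB
# 128-bit -> 4 chips -> 4 GB or  8 GB
```

That's why a 192-bit card has to pick between 6GB and 12GB, while a 256-bit card picks between 8GB and 16GB.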
 
AMD is still charging you an $80 premium for the 2GB memory chips on the 6800; they have decided to stop losing money on folks like you.
A) True, AMD has been raising their prices lately. Not as insanely as Nvidia.
B) Calm down with the assumptions about what type of buyer I am. I have no issue spending more money, but not when it's obvious that Nvidia is nickel-and-diming me.
C) If you think 6GB of VRAM at $300 is acceptable in 2021, we live in different tech worlds.
 
C) If you think 6GB of VRAM at $300 is acceptable in 2021, we live in different tech worlds.


And yet, AMD expects the exact same thing from you.

Stop playing up AMD as somehow saving us, when they are matching the same SKUs in NVIDIA's lineup.

https://www.newegg.com/asus-radeon-rx-5600-xt-dual-rx5600xt-t6g-evo/p/N82E16814126428?Description=radeon rx 5600 xt&cm_re=radeon_rx 5600 xt-_-14-126-428-_-Product

REPEAT AFTER ME: THERE IS NOTHING SPECIAL ABOUT AMD NAVI. As far as AMD is concerned, Polaris is dead and buried.

Even the RX 5700 has been all but discontinued (leaving you the 8GB XT, or the 6GB cut).

https://technosports.co.in/2020/10/04/amd-to-stop-focus-on-radeon-rx-5700-gpu-series/

Repeat after me: AMD Navi wants their goddamned profits.
 
And yet, AMD expects the exact same thing from you.

Stop playing up AMD as somehow saving us, when they are matching the same SKUs in NVIDIA's lineup.

https://www.newegg.com/asus-radeon-rx-5600-xt-dual-rx5600xt-t6g-evo/p/N82E16814126428?Description=radeon rx 5600 xt&cm_re=radeon_rx 5600 xt-_-14-126-428-_-Product

REPEAT AFTER ME: THERE IS NOTHING SPECIAL ABOUT AMD NAVI. As far as AMD is concerned, Polaris is dead and buried.

Even the RX 5700 has been all but discontinued (leaving you the 8GB XT, or the 6GB cut).

https://technosports.co.in/2020/10/04/amd-to-stop-focus-on-radeon-rx-5700-gpu-series/

Repeat after me: AMD Navi wants their goddamned profits.
You're saying AMD expects the same, but then point to their cards that are going out, not incoming. We have yet to see what AMD will offer at $300. Either way, none of these cards really have enough horsepower to drive higher resolutions, so extra memory may or may not make much difference, but core counts will, which was the point of this thread. I'm not saying AMD isn't similar; they did the same crap with the RX 560, and even worse, retailers weren't even giving specs, so you really had no clue. At least AMD stepped in and forced AIBs to post the specs, but they could have avoided the confusion completely by not using the same name. Hopefully they don't do it again, but I feel it's not the last time we'll see this.
 
Stop playing up AMD as somehow saving us, when they are matching the same SKUs in NVIDIA's lineup.
Good grief, buddy, you keep reading way too much into what I'm saying. I said AMD used to give us more hardware for similar money. I wondered IF they would do the same now. Nowhere did I claim AMD would be our value savior anymore. I don't expect them to be. And I still think $300 for 6GB is stupid in 2021. If neither of the two players accepts that, I just won't buy anything this generation.

Judge what I say, not what you interpret from what I said.
 
My point exactly. The cut version of the 106 should be a 3050 Ti, not a 3060. The cut 104 is sold as a 3060 Ti. Instead of a full 106 sold as a 3060 and a cut 106 sold as a 3050 Ti, you now have:

A full-fat 106, sold as the 3060 12GB
A cut-down 106, sold as the 3060 6GB
A further cut-down 106, sold as the 3050 (Ti, I'm guessing, but we don't know yet)

Therein lies the problem: extra segmentation in the x60 range with no name change, both sold as the "same" 3060, supposedly differing only in memory, which is untrue. A full 106 ain't a cut-down 106, not by a long shot.

Even a 3060 Ti, 3060, and 3060 LE. Or just call the middle one a 'Super', even though I hate that moniker.

Anything other than simply 3060 12GB / 6GB when they have different specs beyond that.
 
C) If you think 6GB of VRAM at $300 is acceptable in 2021, we live in different tech worlds.
+1, especially if it is slower GDDR6. The 6GB 3060 'LE' and the 6GB 6700 need to be $250 for sure.

Looks like a big drop-off to the 3050. 4GB on a 128-bit bus with GDDR6 had better be closer to $150.

The $200 price point would be a perfect spot for an 8GB 3050 Ti with GDDR6X. That would make a great mainstream card and finally retire the RX 580 as the budget king.
 
I was able to get a 3070 and I'm waiting for this 3070 Ti. I have till February to start my step-up program; if they don't put out a 3070 Ti before then, I'll do the 3080.
 
Perhaps this solves my indecision between AMD and Nvidia this upgrade round. Techspot/HU is one of the most honest and unbiased review sites/channels out there. This crap is not OK and I cannot in good faith give money to its perpetrators. I remain amazed as to why Nvidia does this stuff, throwing rocks at its own roof. They’re already winning! They have nothing to gain from these sorts of tactics.

https://www.techspot.com/amp/news/87946-ugly-side-nvidia-rollercoaster-ride-shows-when-big.html

 
Perhaps this solves my indecision between AMD and Nvidia this upgrade round. Techspot/HU is one of the most honest and unbiased review sites/channels out there. This crap is not OK and I cannot in good faith give money to its perpetrators. I remain amazed as to why Nvidia does this stuff, throwing rocks at its own roof. They’re already winning! They have nothing to gain from these sorts of tactics.

https://www.techspot.com/amp/news/87946-ugly-side-nvidia-rollercoaster-ride-shows-when-big.html



Started a new thread here:
(Edit) Actually, one was already here:
https://hardforum.com/threads/nvidia-today.2004967/#post-1044847477
 
Further confirmation that the 3060 6GB is nothing more than a renamed 3050 Ti.

https://videocardz.com/newz/nvidia-...ed-till-february-rtx-3060-12gb-6gb-in-january

The 12GB model is the only real 60-series card. I would hope for a $299 price, but I can see Nvidia charging $349, as it seems the 3060 Ti took the price slot of the 2060 Super and the 3060 will take that of the 2060 after the Super launch. That means the 3050 range (despite being called 3060 6GB) is now $299, which is bonkers for a range that is supposed to be sub-$200.

I can't wait to see what AMD releases. I'm looking to spend $300, and at this price there's not much point in looking at RT as a feature - the 3060 Ti already barely does the minimum, and a 3070 is the basic guarantee of good RT performance. None of the sub-Ti cards will have respectable RT performance, which makes AMD's value proposition at this price range an interesting prospect.

Release for both AMD and Nvidia is early January, so we should have lots more leaks soon.
 
Further confirmation that the 3060 6GB is nothing more than a renamed 3050 Ti....

Why? There is a large gap between the 3060 and the 3050. Likely the 3050 Ti will get more cores and 8GB on a 128-bit bus.

The 3060 6GB will still be faster at 1080p high / 1440p medium, while the 3050 Ti might be faster in a few 1080p ultra scenarios.

It's basically like the 5600 XT vs the 5500 XT 8GB.
 
It's basically like the 5600 XT vs the 5500 XT 8GB.
It is not the same, because the 5600 and 5500 you mention are sold as different GPUs, while the 12GB and 6GB Nvidia variants will be sold as if they were different memory configurations of the same GPU: a 3060. They are not.
 
They apparently won't give the RTX prefix to the '50 series.
An RTX 3050 is still rumored to happen, though. Probably too early to tell. Either way, it wouldn't be much use; it would probably perform like a GTX 1060, processing DXR through its regular old shader cores.
 
It is not the same, because the 5600 and 5500 you mention are sold as different GPUs, while the 12GB and 6GB Nvidia variants will be sold as if they were different memory configurations of the same GPU: a 3060. They are not.
I wasn't arguing that the 3060 6GB was a deceptive, bad name. I just think there will be a 3050 Ti below it.
 
The 12GB model is the only real 60-series card.
I feel like we went through all this already but

GA102 - 384-bit bus (3090)

GA102 - 320-bit bus (3080 Ti)
GA102 - 320-bit bus (3080)

GA104 - 256-bit bus (3070)
GA104 - 256-bit bus (3060 Ti)

GA106 - 192-bit bus (3060 12GB)
GA106 - 192-bit bus (3060 6GB)

GA107 - 128-bit bus (3050)

For me it is really unclear: if there happens to be a 3050 Ti with a GA106 on a 192-bit bus, does that make it a 3060, or does that make the 3060 6GB a 3050? And if the 3060 6GB is a 3050 card, how can the 3060 12GB be a 3060?

The 3060 12GB is the same chip, same bus, just a little more CUDA, RT cores and frequency; that is about as "same family, slightly different tier" as it gets, no?

IMO the 3060 Ti is on the same tier as the 3070 spec-wise, and the benchmarks seem to tell the same story; I would expect the same to happen again here.
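Those bus widths also map directly onto peak memory bandwidth: GB/s = (bus width in bits / 8) x effective data rate in Gbps. A quick sketch; the 3090/3080/3070/3060 Ti rates are the published ones, while the 14-16 Gbps GDDR6 figures for the unannounced 3060 are assumptions, not confirmed specs:

```python
# Peak memory bandwidth from bus width and effective data rate.
#   GB/s = (bus_width_bits / 8) * data_rate_gbps
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(384, 19.5))  # RTX 3090, GDDR6X          -> 936.0
print(bandwidth_gbs(320, 19.0))  # RTX 3080, GDDR6X          -> 760.0
print(bandwidth_gbs(256, 14.0))  # RTX 3070 / 3060 Ti, GDDR6 -> 448.0
print(bandwidth_gbs(192, 14.0))  # RTX 3060, assumed 14 Gbps -> 336.0
print(bandwidth_gbs(192, 16.0))  # RTX 3060, assumed 16 Gbps -> 384.0
```

If both 3060 variants really do sit on the same 192-bit bus as listed above, they share the same bandwidth ceiling, and the shader count is what actually separates them.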
 
Yes, there are similarities to argue it's a 3060-family card, but Nvidia doesn't do that with the 70 or 80 range; they call those cut-down same-chip cards the immediately lower tier plus Ti.
I am not sure I follow (it is such strange semantics to wrap my mind around), but the 3080 "lite" is being sold as a 3080, and a less-cut one will be sold as a 3080 Ti.

Almost all GPUs sold are chips being cut down to allow better yields, no? There does not seem to be a strong logic enforced anywhere along the line in that Ampere family stack.

I think if they called that 3060 12GB a 3060 Super and the other one just a 3060, everything would make sense; the issue is the exact same name, not that both are put on the same xx60 tier, IMO.
 