4060 Ti 16GB due to launch next week (July 18th)

I think AMD is still playing the same game with Nvidia, in that they want to see what Nvidia will do first. The RX 7600 is a trash GPU that even AMD admitted needed a price cut before release. The 6650 XT performs better while being around $250, much like how the RTX 3060 Ti performs better than the 4060 Ti and is cheaper too. I think it's clear that neither AMD nor Nvidia wants to move the bar past last gen, so consumers don't gain much value from the current situation. You can buy a 6800 XT for $480 right now on Amazon, so it would be worth it for AMD to further lower prices just to make it official.
Nvidia has already done what they are going to do. I mean, their entire line is out. What AMD is doing is trying to get people to buy the old cards, because they ordered tons of them and then the pandemic demand started drying up, right about when those tons of cards finally became available to consumers. It's exactly why we don't have an RX 7700 and 7800 right now.

And Nvidia created an artificial price gap with the new products so that people will consider older cards in the $400-$600 range. However, sales are happening and it's getting real tight. I mean, right now you can get a 4070 for less than $550 with Newegg's 12% off with Zip checkout, and then stack another 4% from Bing. Not a lot of room left for the old 3080s. Especially when the 4070 is under 200 watts! And the 3070/Ti ought to be eating the 4060 Ti's lunch on price/performance. That said, 4060 Tis draw only about 150 watts, and that matters to ITX builders.
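If anyone wants to sanity-check that deal math, here's a quick sketch. It assumes the two promos stack multiplicatively (12% off at checkout, then 4% cashback on the discounted price); the list prices are illustrative, not actual Newegg listings, and the real promo terms may differ:

```python
# Quick check of the stacked-discount math: 12% off at checkout (Zip),
# then 4% cashback (Bing), applied one after the other. List prices
# below are illustrative, not actual Newegg listings.

def stacked_price(list_price: float, discounts: list[float]) -> float:
    """Apply each fractional discount in sequence and return the net price."""
    price = list_price
    for d in discounts:
        price *= 1.0 - d
    return price

if __name__ == "__main__":
    for msrp in (599.99, 649.99):
        net = stacked_price(msrp, [0.12, 0.04])
        print(f"${msrp:.2f} list -> ${net:.2f} net")
    # $599.99 -> $506.87, $649.99 -> $549.11 (under the $550 mentioned above)
```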
 
Nvidia has already done what they are going to do. I mean, their entire line is out. What AMD is doing is trying to get people to buy the old cards, because they ordered tons of them and then the pandemic demand started drying up, right about when those tons of cards finally became available to consumers. It's exactly why we don't have an RX 7700 and 7800 right now.
I would think AMD and Nvidia both want their old cards sold out. The lack of a 7700 and 7800 kinda seems like AMD still doesn't want to fight Nvidia and start a price war, which is the only way AMD is going to sell GPUs. Crypto is dead and they can no longer just sit in Nvidia's shadow and still make a hearty profit, but with the AI craze I think AMD believes there's yet more to come. AMD not releasing the 7700 and 7800 is doing Nvidia more of a favor than their own AIBs.

And Nvidia created an artificial price gap with the new products so that people will consider older cards in the $400-$600 range. However, sales are happening and it's getting real tight. I mean, right now you can get a 4070 for less than $550 with Newegg's 12% off with Zip checkout, and then stack another 4% from Bing. Not a lot of room left for the old 3080s. Especially when the 4070 is under 200 watts! And the 3070/Ti ought to be eating the 4060 Ti's lunch on price/performance. That said, 4060 Tis draw only about 150 watts, and that matters to ITX builders.
Right now I can't find a cheap 3080. They're going for $900, whereas a 4070 is going for $600. You can find used 3080s on eBay for less than $400, which tells me that retailers have run out of 3080s. Whereas you can find an RX 6800 for around $450 new, which isn't much cheaper used on eBay. Seems like Nvidia is having great success selling off its RTX 30 series compared to AMD selling off its RX 6000 series, at least on the high-end models. Also, I don't think many people care about power consumption. This isn't a laptop, where that matters.
 
They do. No one likes to sweat while gaming or have to pump the air conditioning because their computer is putting out as much heat as a space heater. Plus in many countries electricity is now quite expensive.
The thought of not having to run AC is appealing…
 
I mean, my central air is going to be on regardless, but that still doesn't stop the office room from being 10°F warmer.
My office is in the part of the house that was built some time in the 1920s…
I'm lucky I can power my PC on without blowing a fuse.

I know my needs require I replace wiring and panels, but I'm torn between wanting to take on a major renovation and not getting divorced, because I guarantee the wife and I will have very, very different ideas of what needs to happen should I start that.
 
They do. No one likes to sweat while gaming or have to pump the air conditioning because their computer is putting out as much heat as a space heater. Plus in many countries electricity is now quite expensive.
Hopefully they don't have a 13th-gen K-series chip in their sig. :sneaky:
 
They do. No one likes to sweat while gaming or have to pump the air conditioning because their computer is putting out as much heat as a space heater. Plus in many countries electricity is now quite expensive.
As someone who still has an FX 8350 laying around and currently has a Vega 56 in his PC, it makes no difference. The PC still pumps heat and still warms the room. On the plus side, it makes the room warm enough in the winter to not need to turn the heat on. If you have a 13900K with an RTX 4090, you're in no position to talk.

Unless you game 24/7 all year long, it won't budge the electric bill by much. If you own an RTX 4090, game 10 hours a week, and your electric cost is $0.14 per kilowatt-hour, then you're paying $32.76 a year for a 450-watt graphics card. The RTX 3080 draws around 340 watts under load, about 24% less than the 4090, which makes the 3080 roughly $25 a year. The RTX 4070 draws about 193 watts, around 57% less than the 4090 (I'm using the 4090 as the base), which works out to about $14 a year. This also assumes the graphics card is being 100% utilized; if you're playing Minecraft with Vsync on, you won't be drawing anywhere near the max power the GPU can handle.

It also depends on where you live. As an NJ resident I pay $0.14 per kilowatt-hour, but Hawaiians pay around $0.30 and Californians around $0.20. On the flip side, most states pay less than $0.10 per kilowatt-hour. On the European side, the highest is Romania at about €0.36 per kWh, while the lowest is Finland at about €0.115 per kWh; most European countries are below €0.23 per kWh. Higher than what Americans pay, but not high enough to matter that much. I haven't even gotten into undervolting. It really doesn't matter.
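For anyone who wants to check that arithmetic, here's a minimal sketch using the same assumptions (full-load board power, 10 hours of gaming a week, $0.14/kWh). Real-world draw will be lower whenever the card isn't pinned at 100%:

```python
# Minimal sketch of the annual-cost arithmetic above. Wattages are the
# full-load board power figures from the post; hours and rate match its
# assumptions. Real draw is lower whenever the GPU isn't fully utilized.

def annual_cost(watts: float, hours_per_week: float, usd_per_kwh: float) -> float:
    """Annual electricity cost for a GPU running at the given wattage."""
    kwh_per_year = watts / 1000 * hours_per_week * 52
    return kwh_per_year * usd_per_kwh

if __name__ == "__main__":
    rate = 0.14   # USD per kWh (the NJ rate from the post)
    hours = 10.0  # gaming hours per week
    for name, watts in [("RTX 4090", 450), ("RTX 3080", 340), ("RTX 4070", 193)]:
        print(f"{name}: ${annual_cost(watts, hours, rate):.2f}/year")
    # RTX 4090: $32.76/year, RTX 3080: $24.75/year, RTX 4070: $14.05/year
```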
 
4060 Ti 16GB seems like a waste of resources, if you ask me.
Not if you are looking to do art, 3D modeling, or any prosumer workloads. Team it up with the Studio (creator) drivers instead of the Game Ready ones and you have yourself a solid alternative to any of the RTX 2000- or 4000-series workstation cards at a fraction of the price.
An entry-level workstation card with 20GB of RAM and half the GPU performance starts in the $1,200 range.
I'd put this in a workstation over a T1000 any day, given the choice.
 
No samples, no drivers, no reviews.

https://www.techpowerup.com/311348/nvidia-geforce-rtx-4060-ti-16gb-review-not

We talked to all the partners, and friends in the industry. We learned that neither NVIDIA nor the partners are sampling the RTX 4060 Ti 16 GB. We did try to arrange samples through back-channels, which turned out to be a bust, too, nobody wanted to touch these cards. To prevent those reviewers who could somehow score cards in partnership with retailers, NVIDIA ensured there was no driver available until earlier today. Without drivers, there's no way for anyone to test the card, and it shows—we've scoured the web, and nobody has a review.
 
It's worth noting that this isn't the first time Nvidia hasn't directly sampled press and influencers. The RTX 3090 Ti wasn't sampled, nor were the RTX 3080 12GB and GTX 1630. Other GPUs in the past have appeared without having the full launch treatment as well, things like the RTX 2060 12GB or GTX 1060 5GB.

What these GPUs all generally have in common is that they're not perceived as being marquee releases. They're less important launches, often of hardware that might not get too favorable of a reception from reviewers.

https://www.tomshardware.com/news/nvidia-geforce-rtx-4060-ti-16gb-goes-on-sale
 
Well, this is an annoying internal debate: a 4060 Ti 16GB OC for $499.99 or a 4070 12GB for $599.99. Is the hundo difference worth the swing between more RAM on a 128-bit bus and less RAM on a 192-bit bus?
 
Well, this is an annoying internal debate: a 4060 Ti 16GB OC for $499.99 or a 4070 12GB for $599.99. Is the hundo difference worth the swing between more RAM on a 128-bit bus and less RAM on a 192-bit bus?
For $100 more the 4070 is the better deal. If the difference were $200 or more, then that would be something to think about.

12GB is fine up to 1440p medium/high, which suits the 4070 fine.
It seems to me the one job of the 4060 Ti 16GB is to make the 4070 more appealing.

The 4060 Ti 16GB is not worth more than $380, imo.
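On the 128-bit vs 192-bit question, the raw bandwidth gap is easy to put a number on. A quick sketch, using the published memory data rates for each card (Ada's large L2 cache narrows the practical gap somewhat, but the raw difference is still big):

```python
# Peak memory bandwidth = bus width in bytes x effective data rate.
# Data rates are the published memory speeds: 18 Gbps GDDR6 on the
# 4060 Ti, 21 Gbps GDDR6X on the 4070.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

if __name__ == "__main__":
    for name, bus, rate in [("RTX 4060 Ti 16GB", 128, 18.0),
                            ("RTX 4070", 192, 21.0)]:
        print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
    # RTX 4060 Ti 16GB: 288 GB/s, RTX 4070: 504 GB/s
```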
 
It's pointless; get a 4070 or just drop down to the vanilla 4060 and save your money until next gen.
 
No samples, no drivers, no reviews.

https://www.techpowerup.com/311348/nvidia-geforce-rtx-4060-ti-16gb-review-not

We talked to all the partners, and friends in the industry. We learned that neither NVIDIA nor the partners are sampling the RTX 4060 Ti 16 GB. We did try to arrange samples through back-channels, which turned out to be a bust, too, nobody wanted to touch these cards. To prevent those reviewers who could somehow score cards in partnership with retailers, NVIDIA ensured there was no driver available until earlier today. Without drivers, there's no way for anyone to test the card, and it shows—we've scoured the web, and nobody has a review.
That is teh suck. Super lame on Nvidia. They want consumers to buy it blind.
 
That is teh suck. Super lame on Nvidia. They want consumers to buy it blind.
I mean, let's be honest, Nvidia is a company that wants to make money. They do not care how much they fleece their customers... why? Because they know people will buy the cards no matter the price/performance.

I am not anti-Nvidia either, just stating the obvious.
 
That is teh suck. Super lame on Nvidia. They want consumers to buy it blind.
It's more likely that they just don't care how well or badly it does. At this point their focus is squarely on other endeavors and gaming will be taking a massive back seat for the foreseeable future.
 
It's more likely that they just don't care how well or badly it does. At this point their focus is squarely on other endeavors and gaming will be taking a massive back seat for the foreseeable future.
Not sure gaming will take a back seat as much as NV won't cater to the mid-to-low end anymore. Why chase that volume when it can be made up by selling fewer, but much higher-profit, AI/professional cards and x70/x80/x90 gaming GPUs? I suspect they are going to leave the low end to AMD and Intel.
 
Not sure gaming will take a back seat as much as NV won't cater to the mid-to-low end anymore. Why chase that volume when it can be made up by selling fewer, but much higher-profit, AI/professional cards and x70/x80/x90 gaming GPUs? I suspect they are going to leave the low end to AMD and Intel.
Sad but true. It's depressing to think about, unless Intel comes out with more solid drivers for their offerings. And by then I imagine they might play by the same rules AMD has been playing by and price their products close to AMD's, because why not.
 
It's more likely that they just don't care how well or badly it does. At this point their focus is squarely on other endeavors and gaming will be taking a massive back seat for the foreseeable future.
Why even bother with the release? Just amazing how much suckier they keep getting.
 
Why even bother with the release? Just amazing how much suckier they keep getting.
Because they can. They know that it occupies a slot against their competitors even if it's not a good value proposition. The point is to make sure your competitors don't take it.
 
Because they can. They know that it occupies a slot against their competitors even if it's not a good value proposition. The point is to make sure your competitors don't take it.
As I've said earlier, I think it is a shit value for a gaming card, but as an option for a workstation, I can see it as a valuable alternative to what is out there currently.
Like, I would need to see benchmarks, but for somebody who is learning CAD, 3D design, or drafting, wants to get into game development, or wants to toy with AI stuff at home, it is a solid entry point to that market.
But for gaming, there are better options, and at its current price this should be avoided if presented with any other option.
 
4070 is a MUCH better GPU
While yes it is, wouldn't getting the 4070 JUSTIFY the pricing that Nvidia is implementing, then? One of the things that irks me is that people are still buying 4070s now because the 4060 series is bad value. Well, guess what: when the 4070 was released, it was ripped for bad pricing. Then the 4060s release and, oh, I'll just get a 4070 instead, which shows Nvidia that they can price their cards this way since users are still buying them!
 
While yes it is, wouldn't getting the 4070 JUSTIFY the pricing that Nvidia is implementing, then? One of the things that irks me is that people are still buying 4070s now because the 4060 series is bad value. Well, guess what: when the 4070 was released, it was ripped for bad pricing. Then the 4060s release and, oh, I'll just get a 4070 instead, which shows Nvidia that they can price their cards this way since users are still buying them!
Indeed.

You could get 4070s for close to $500, with the 12% off with Zip quad pay and then stacking another 4% with Bing cashback.

Zip promo is over, though.
 
Well, this is an annoying internal debate: a 4060 Ti 16GB OC for $499.99 or a 4070 12GB for $599.99. Is the hundo difference worth the swing between more RAM on a 128-bit bus and less RAM on a 192-bit bus?
Yeah, go for the 4070.
For $100 more the 4070 is the better deal. If the difference were $200 or more, then that would be something to think about.

12GB is fine up to 1440p medium/high, which suits the 4070 fine.
It seems to me the one job of the 4060 Ti 16GB is to make the 4070 more appealing.

The 4060 Ti 16GB is not worth more than $380, imo.
It's not even the 4060 Ti; it's just a 4060 with 16GB of RAM vs 8GB.
 
https://www.extremetech.com/gaming/rtx-4060-ti-16gb-benchmarked-by-msi-is-slower-than-8gb-version
Well how's that for a kick in the dick.

TL;DR:
The denser memory modules result in lower overall performance than the 8GB version.
On paper, it certainly is a kick in the dick, but the delta being a paltry 0.5% is just meh. I can't even be bothered to care. I've said it before, and I'll say it again: anything outside of the 4090 is bad value, so why expend any energy at all on something that really is splitting hairs in this case?

Edit: I'm still sitting over here wondering when any real journalism will happen with the FSR/DLSS bullshit that has been ongoing and mostly ignored.
 
On paper, it certainly is a kick in the dick, but the delta being a paltry 0.5% is just meh. I can't even be bothered to care. I've said it before, and I'll say it again: anything outside of the 4090 is bad value, so why expend any energy at all on something that really is splitting hairs in this case?

Edit: I'm still sitting over here wondering when any real journalism will happen with the FSR/DLSS bullshit that has been ongoing and mostly ignored.

Yeah, doesn't look like much of a big deal other than being mostly pointless. Maybe it's good for people who need cheap AI cards.

As for the upscaler dramas, look into Nixxes refining their specs/requirements for AMD cards on the new Ratchet and Clank port.
 
Yeah, doesn't look like much of a big deal other than being mostly pointless. Maybe it's good for people who need cheap AI cards.

As for the upscaler dramas, look into Nixxes refining their specs/requirements for AMD cards on the new Ratchet and Clank port.
I mean, there they can make up the performance difference by using system RAM. Not an option on consoles; when doing console development, the limited memory puts a much greater premium on fast storage speeds. Otherwise you have to cut textures accordingly.

Or is there more drama there than the removal of the NVMe storage requirement?
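A back-of-the-envelope sketch of that memory-vs-storage tradeoff; every number here is made up for illustration, not taken from any real title:

```python
# Illustrative only: with less memory available to keep textures resident,
# more of the working set must be streamed from storage on demand, so the
# required storage throughput goes up. Numbers are invented for the example.

def required_throughput(working_set_gb: float, resident_gb: float,
                        refill_window_s: float) -> float:
    """GB/s of storage throughput needed to page in the non-resident part
    of the texture working set within the given time window."""
    missing = max(0.0, working_set_gb - resident_gb)
    return missing / refill_window_s

if __name__ == "__main__":
    working_set = 12.0  # GB of textures a scene can demand (hypothetical)
    window = 2.0        # seconds allowed to repopulate after a scene cut
    for resident in (10.0, 6.0, 4.0):
        need = required_throughput(working_set, resident, window)
        print(f"{resident:.0f} GB resident -> need {need:.1f} GB/s from storage")
    # 10 GB -> 1.0 GB/s, 6 GB -> 3.0 GB/s, 4 GB -> 4.0 GB/s
```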
 
Yeah, doesn't look like much of a big deal other than being mostly pointless. Maybe it's good for people who need cheap AI cards.

As for the upscaler dramas, look into Nixxes refining their specs/requirements for AMD cards on the new Ratchet and Clank port.
Do you mean how they have a 3060 Ti listed in the same tier as a 6800? Wouldn't that likely have to do with DirectStorage? Nvidia cards are able to do hardware decompression much better than comparable AMD cards, are they not?
 