4060 Ti 16GB due to launch next week (July 18th)

How quickly Nvidia has turned PC users into console users. They only want to talk about weaker hardware using upscaling.
Last time I checked, AMD has nothing to compete against Nvidia's top end. It's not even remotely close in performance.

I don't like this cycle from either Nvidia or AMD, but to suggest Nvidia is the company with the weaker hardware is ludicrous at best and fanboy trolling at worst.
 
At least Nvidia has released pretty much all of their new products. Can't say the same for the supposed competition.

Doesn't make Nvidia products a good value, but they are at least out there. AMD fans wonder why their favorite "underdog/cares about us/not evil company" continues to lose market share.
 
Not just lose market share, but deliver consistently weaker hardware at prices just as inflated as Nvidia's. And they still have the balls to complain about Nvidia's pricing when the competition has fuck all to compete with.
 
The only thing this card manages to do successfully is prove that the whole lineup is memory starved.

The 8GB should never have existed and this card should be priced $50 cheaper than the 8GB version currently sells for.
 
Using a large suite-of-games average is probably not the way to compare cards that differ only in VRAM; it drowns out the moments when VRAM matters a lot with all the time it does not.

Conclusion of TPU review:
  • No significant performance gains from 16 GB VRAM

relative-performance-3840-2160.png


Hard to know if it proves the lineup is memory starved, or if the 16GB edition proves the lineup is starved for memory bandwidth and raw GPU power... because even with 16GB it does not beat a 2080 Ti (the fact that it sat under the 8GB 3070 most of the time did not leave much doubt).

Maybe TPU would have had to be really creative to find settings (or specific parts of games) where the 8GB edition drops quite low in FPS while the 16GB does not, but outside native 4K with RT on (almost irrelevant for the 4060s) they show no difference, even in Hogwarts Legacy, A Plague Tale: Requiem, and Jedi: Survivor, even in sub-60fps scenarios.
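On the earlier point about suite-wide averages: a quick sketch with made-up numbers (purely hypothetical, not TPU data) shows how averaging over many games can bury the one scenario where VRAM actually matters:

```python
# Hypothetical per-game average FPS for an 8GB and a 16GB card.
# The cards tie in 9 of 10 games; the 8GB card collapses in one
# VRAM-limited title. All numbers are invented for illustration.
fps_8gb = [60, 62, 58, 61, 59, 60, 63, 57, 60, 20]
fps_16gb = [60, 62, 58, 61, 59, 60, 63, 57, 60, 55]

def suite_average(samples):
    return sum(samples) / len(samples)

avg8 = suite_average(fps_8gb)    # 56.0
avg16 = suite_average(fps_16gb)  # 59.5

# The suite-wide gap looks like ~6%...
print(f"suite-wide gap: {avg16 / avg8 - 1:.0%}")
# ...while in the one VRAM-limited game the 16GB card is 2.75x faster.
print(f"VRAM-limited game gap: {fps_16gb[-1] / fps_8gb[-1]:.2f}x")
```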
 
The average remains the same, but 1% lows and visual quality are 20-30% better, as it doesn't suffer from the texture-swapping issues the 8GB model does.
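For anyone unfamiliar with the metric: "1% lows" are usually computed as the average of the slowest 1% of frames, which is why they diverge from the plain average exactly when a card stutters. A minimal sketch with invented frame data (not review numbers):

```python
def one_percent_low(fps_samples):
    """Average of the slowest 1% of frames (at least one frame)."""
    worst = sorted(fps_samples)
    count = max(1, len(worst) // 100)
    return sum(worst[:count]) / count

# 198 frames at a steady 60 fps plus two stutters down to 20 fps:
samples = [60.0] * 198 + [20.0, 20.0]
print(f"average: {sum(samples) / len(samples):.1f} fps")  # the mean barely moves
print(f"1% low:  {one_percent_low(samples):.1f} fps")     # the stutter shows up here
```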
 

The only exception is The Last of Us, which gains +19% at 1440p and +21% at 4K. F1 23 gains 10% at 4K.

(Also, he doesn't state it explicitly, but RT on favors the 16GB card in a few games.)

rt-doom-eternal-2560-1440.png
rt-doom-eternal-3840-2160.png
rt-far-cry-6-3840-2160.png
rt-resident-evil-4-3840-2160.png
 
Texture pop-in is not captured in FPS metrics.
And I am not sure testing with a 13900K is realistic in that regard (you can probably stream textures on the fly quite well with that kind of CPU); it masks the lack of VRAM more than a more realistic match for a xx60 card, like a 5600X/12600K-class system, would. But there were significant recent patches to that game (it went from 1.0x to 1.1 last month), from VRAM management to a texture-streaming option being added, so we would need to see how much of an issue it still is.
 
And? The price is also significantly higher.

Considering the 3060 Ti can be found at $350:
https://www.newegg.com/zotac-geforce-rtx-3060-ti-zt-a30610h-10mlhr/p/N82E16814500518

View attachment 585398

Both 4060 Ti models are quite expensive for what they are; the price will have to come down for them to move.
That's a 1440p chart. I was considering its 4K performance increase. If it goes on sale it would be appealing to a lot of gamers. The 3060 Ti is not even a consideration for 4K because of its small VRAM, right? (Is the 3060 Ti 8 or 12GB?) Personally, the 4070 Ti's performance is more my target because it's about equal to a 3080 Ti, maybe a bit faster.
 
That's a 1440p chart. I was considering its 4K performance increase.
Not sure how relevant native 4K is for a 60-series card (the 3070, stronger in absolute terms and more so relative to the games of 2020, was not seen much as a native 4K card), but the gains seem quite modest, and the argument about struggling to beat the $350 3060 Ti / $330 6700 XT would hold:

relative-performance-3840-2160.png
minimum-fps-3840-2160.png


And more importantly, they consider the 4060 Ti to be terrible, so saying that the $100 more expensive 16GB version does better in some cases does not make the claim that it is a terrible purchase at the current price ridiculous.

When you look at the performance per dollar:
https://www.techpowerup.com/review/nvidia-geforce-rtx-4060-ti-16-gb/32.html

It is especially badly placed: being beaten at every resolution by an $800 4070 Ti is not normal. Perf per dollar should get worse and worse as you go up the stack, and it is not like the $800 4070 Ti was a high bar to pass.

It is a bit of a strange place to be: the 4060 Ti is too expensive for an 8GB card in 2023, and the 16GB is too weak for $500 (and in a lot of cases outside ultra-high textures, it will not have the memory bandwidth or raw grunt to run well the games, at the resolutions/ray-tracing settings, that had issues with just 8GB of VRAM to start with).
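The perf-per-dollar point can be sketched with the thread's prices and assumed relative-performance indices (the indices below are illustrative guesses, not TPU's measured numbers):

```python
# Price in USD and assumed relative performance (4060 Ti 16GB = 100).
# Performance indices are illustrative assumptions, not measured data.
cards = {
    "RTX 3060 Ti":      (350, 90),
    "RTX 4060 Ti 16GB": (500, 100),
    "RTX 4070 Ti":      (800, 165),
}

for name, (price, perf) in cards.items():
    print(f"{name:18s} {perf / price:.3f} perf per dollar")

# Normally perf per dollar falls as you move up the stack; with these
# numbers the $800 card actually scores higher than the $500 one,
# which is the anomaly the review calls out.
```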
 
Agree. Any card released for $350+ in 2023 should match or beat the 2080 Ti.

Interesting how Nvidia is still a bit better at memory management. Look at The Last of Us TPU results for the 6800, 6700 XT, and 2080 Ti.
At 1080p, those cards get 75, 72, and 67 fps respectively.

However, at 4K, those cards get 34, 28, and 25, meaning the 12GB AMD card's lead over the 11GB Nvidia card narrows. It doesn't look to be architecture related, as the 16GB 6800 remains in the lead.

Perhaps a 10GB Nvidia card could have matched a 12GB AMD card.
 
Seemed like clown Steve from Hardware Unboxed got much better results in his 4K tests of the 16GB version?
 
HUB makes the case for more VRAM again.

Discusses texture-quality issues in Halo Infinite & Forspoken and a performance issue in The Callisto Protocol (RT on & ultra textures).

On a positive note, GDDR7 supports 3GB modules, which makes 12GB of VRAM possible on the same bus.
So a potential 4060 Ti refresh might have only a single version of the card (4160 Ti?) but with 12GB of VRAM.

 
Wait, that game is on PC? I thought our SSDs aren't fast enough?
Microsoft fixed that with DirectStorage 1.2, which gave up to a 40% boost over 1.1 on the same hardware.

It also added much better support for non-NVMe drives by allowing buffered IO, which is required for mechanical drives and SSDs with cheaper controllers.

In addition, Microsoft finally added support for developers to put a fallback in place in the event the GPU can't natively decompress the textures.

So 1.2 vastly expanded system compatibility while also greatly improving performance.

With 1.1, it would have been very limiting as to which systems could actually support the game, and certainly not a large enough audience to support a PC launch.
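To make the compatibility story concrete, here is a toy model of the path selection described above, written in Python purely for illustration (the real DirectStorage API is a C++ COM interface; the drive categories and fallbacks here just mirror the post's description):

```python
def choose_io_path(drive_type, gpu_can_decompress):
    """Toy model of DirectStorage 1.2's expanded compatibility.

    drive_type: "nvme", "sata_ssd", or "hdd" (illustrative categories).
    Returns an (io_mode, decompression) pair.
    """
    # 1.2 added buffered IO, which mechanical drives and SSDs with
    # cheaper controllers require; NVMe keeps the fast unbuffered path.
    io_mode = "unbuffered" if drive_type == "nvme" else "buffered"

    # 1.2 also lets developers register a CPU fallback for GPUs that
    # cannot decompress the textures natively.
    decompression = "gpu" if gpu_can_decompress else "cpu_fallback"
    return io_mode, decompression

print(choose_io_path("hdd", gpu_can_decompress=False))
```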
 
Oh good, it runs just fine on Linux. Actually runs faster on Linux, by a good deal. What's that about DirectStorage?
 

Still uses DirectStorage, but Linux has a vastly better file system and does much better with moving lots of small files.

Regardless, DirectStorage is what lets the GPU fetch data directly from storage. Linux has its versions, but to my knowledge they are all proprietary. Vulkan doesn't have it yet.
 
Still uses DirectStorage, but Linux has a vastly better file system and does much better with moving lots of small files.
I don't think that's at play here.
Regardless, DirectStorage is what lets the GPU fetch data directly from storage. Linux has its versions, but to my knowledge they are all proprietary. Vulkan doesn't have it yet.
The game would be using Vulkan since it's using VKD3D.
 
Here's where this gets really fucked up:
RTX IO integrates AMD's Smart Access Storage, and Nvidia brought the RTX IO API to Vulkan.
So while RTX IO and Smart Access Storage are both proprietary, they are both open. RTX IO also wraps DirectStorage 1.2, so by putting RTX IO in there you cover the lot.
 
I think the 16GB 4060 Ti will have legs for years.

ppl b1tch about DLSS and Frame Generation, but I've seen both in action, and they work great. In Jedi: Survivor, DLSS improves picture quality and the frame gen was flawless. For lower-powered cards those technologies will be even more beneficial.
 
Please post some screenshots. It was painfully obvious the last time I saw it.
 
When was that? DLSS has been great for years at this point. It also looks different in motion on a 4K screen. Frame generation works spectacularly well if your base fps isn't too low, by all accounts from actual owners.
I was referring to the part about frame generation. Lower-powered cards aren't going to have high base rates, and I'd like to see it, since they said it was flawless.
 
It usually depends on the resolution and settings you use; a 4060 runs Jedi: Survivor at 1080p at around 66-68 fps with max details, which is a really good case for DLSS 3.

And frame generation's subjective benefit is biggest at very low frame rates (i.e. going from 18 fps to 30 fps would probably be the biggest subjective gaming-experience increase you can get), at least that's what the blind tests seem to show.
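As a back-of-the-envelope on that 18-to-30 fps case: frame generation inserts one generated frame per rendered frame, so the idealized ceiling is 2x the base rate (the 18-to-30 figure reflects real-world overhead), while input latency stays tied to the base frame time. A sketch under that idealized no-overhead assumption:

```python
def frame_gen(base_fps):
    """Idealized frame generation: one generated frame per rendered
    frame, ignoring generation overhead (an assumption; real DLSS 3
    gains less, e.g. the 18 -> 30 fps case above)."""
    presented_fps = base_fps * 2
    base_frame_time_ms = 1000 / base_fps       # input latency tracks this
    presented_frame_time_ms = 1000 / presented_fps
    return presented_fps, base_frame_time_ms, presented_frame_time_ms

fps, base_ms, out_ms = frame_gen(18)
print(f"presented: {fps} fps ({out_ms:.1f} ms pacing), "
      f"latency still tied to the ~{base_ms:.1f} ms base frame time")
```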
 
Here's where this gets really fucked up:
RTX IO integrates AMD's Smart Access Storage, and Nvidia brought the RTX IO API to Vulkan.
So while RTX IO and Smart Access Storage are both proprietary, they are both open. RTX IO also wraps DirectStorage 1.2, so by putting RTX IO in there you cover the lot.
They both use GDeflate, which is an open standard in DirectStorage. The NVIDIA- and AMD-specific interfaces are there to optimize how GDeflate runs on their respective hardware.

https://github.com/microsoft/DirectStorage/tree/main/GDeflate
 
I think the 16GB 4060 Ti will have legs for years.

ppl b1tch about DLSS and Frame Generation, but I've seen both in action, and they work great. In Jedi: Survivor, DLSS improves picture quality and the frame gen was flawless. For lower-powered cards those technologies will be even more beneficial.

Please post some screenshots. It was painfully obvious the last time I saw it.
In this thread: https://hardforum.com/threads/modder-implements-dlss-3-frame-gen-dlss-2-in-jedi-survivor.2029003/
 