NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti / Review Round Ups

I agree with AdoredTV that AMD gave up. To compete against Nvidia, AMD would need to spend a lot of money, money that would ultimately be wasted because Nvidia always has a faster card ready to beat AMD. If AMD beats Nvidia on pricing, it's a race to the bottom, which doesn't benefit either company. If AMD priced their RX 7800 at $600, nobody would buy it, but price it down to $400 and Nvidia would drop their prices to match, so AMD wouldn't get any more sales. This is why I think AMD has an unwritten agreement with Nvidia to keep their prices relatively close to Nvidia's in order to drive GPU prices higher. Kinda like price fixing.
Yes, both AMD & Nvidia have responsibilities to their shareholders & board partners, so they can't price too low.

Intel's dGPU business is still at risk if they cannot turn a profit.

Radeon will persist in some form or other because it is in the PlayStation & Xbox.

From a consumer point of view, they need to vote with their wallets (e.g. buy a used 3060 12GB/3060 Ti/6700 XT, a discounted 6800 XT, a Series S/PS5, or a 7900 XT+/4080+).
 
Yes, both AMD & Nvidia have responsibilities to their shareholders & board partners, so they can't price too low.

Intel's dGPU business is still at risk if they cannot turn a profit.

Radeon will persist in some form or other because it is in the PlayStation & Xbox.

From a consumer point of view, they need to vote with their wallets (e.g. buy a used 3060 12GB/3060 Ti/6700 XT, a discounted 6800 XT, a Series S/PS5, or a 7900 XT+/4080+).

AMD/Nvidia, to add... need to play the "know when to hold them, know when to fold them" strategy... basically, when do they pull the plug on making previous-gen products to "force" us peasants onto the RTX 4000 series / RX 7x00 XT/XTX, unable to get our hands on discounted previous-gen cards?
 
AMD/Nvidia, to add... need to play the "know when to hold them, know when to fold them" strategy... basically, when do they pull the plug on making previous-gen products to "force" us peasants onto the RTX 4000 series / RX 7x00 XT/XTX, unable to get our hands on discounted previous-gen cards?
From the reports I've been reading, AMD and Nvidia effectively stopped making them when they cut their orders substantially at TSMC and Samsung over recent months; there is just that much extra stock out there. I recall reading on a Korean site last year that silicon from AMD and Nvidia wasn't the limiting factor in board production and that TSMC and Samsung had doubled down to supply them; it was other board components, be it memory, voltage regulators, capacitors, etc., that held up production.
At the time I glanced over it and wrote it off, because it was all stuff made in Korea that was in good supply, while it was the Chinese components that weren't being made fast enough, so I thought it was the typical anti-China stuff that gets published.
But if there was some truth to it, then AIBs could be sitting on upwards of a six-month supply of GPUs, and AMD or Nvidia could just stop making them chips and they would be fine.
This sort of tracks with TSMC's and Samsung's statements on why their orders have been slashed. So maybe I shouldn't have written that article off at the time.
 
From a consumer point of view, they need to vote with their wallets (e.g. buy a used 3060 12GB/3060 Ti/6700 XT, a discounted 6800 XT, a Series S/PS5, or a 7900 XT+/4080+).

Buying used tends to finance the seller's purchase of a new card, and obviously I am not sure buying a high-end model would teach them a lesson; becoming a console gamer certainly could.
 
Buying used tends to finance the seller's purchase of a new card, and obviously I am not sure buying a high-end model would teach them a lesson; becoming a console gamer certainly could.
AMD makes more money from selling consoles than they do from selling GPUs to consumers; I'm not sure that teaches them anything other than that their strategy works.
 
Yes, both AMD & Nvidia have responsibilities to their shareholders & board partners, so they can't price too low.
They can and they will. The reason they aren't pricing low now is that AMD has other sources of revenue for their GPU hardware besides PC, and Nvidia is preaching about AI. We know sales are down, but as long as AMD and Nvidia beat expectations, the shareholders will stick around.
Intel's dGPU business is still at risk if they cannot turn a profit.
Intel won't leave the GPU business just because it can't turn a profit. They realize they need good graphics to compete against AMD and Apple, as well as against Nvidia in the server market. The only reason they make graphics cards is that they need beta testers. Also, Intel isn't the only other player looking to compete in the GPU business.

Radeon will persist in some form or other because it is in the PlayStation & Xbox.
Yep, but AMD still has to pretend to care about the graphics card market.
From a consumer point of view, they need to vote with their wallets (e.g. buy a used 3060 12GB/3060 Ti/6700 XT, a discounted 6800 XT, a Series S/PS5, or a 7900 XT+/4080+).
Yes, but customers right now just buy Nvidia almost unanimously. Buying a PS5/Xbox just reinforces AMD's lack of interest in the PC graphics card market. Buying used just means AMD and Nvidia will only release high-end $1k+ GPUs, while everyone else buys the older generation with no driver updates or warranty. What we need is competition from Intel, so buy their cards. If you buy nothing, suddenly you don't matter. If you buy AMD, Nvidia will just make a faster GPU for $1,600. If you buy a PS5 or Xbox, you'll make AMD happy and they'll continue to sell you $1k XTX graphics cards. Buy Intel, and AMD and Nvidia will realize they're losing potential sales and will have to actually care.
 
Intel won't leave the GPU business just because it can't turn a profit. They realize they need good graphics to compete against AMD and Apple, as well as against Nvidia in the server market. The only reason they make graphics cards is that they need beta testers. Also, Intel isn't the only other player looking to compete in the GPU business.
This right here: GPUs are the hill Intel will die on. The workstation, server, and supercomputer markets only get Intel so far with no viable GPU. For basic compute tasks Intel has more than enough CPU options; yeah, you can go back and forth on Intel vs AMD on various aspects, but in reality the differences are pretty minor. But more GPUs get sold for workstations, servers, and especially supercomputers now than CPUs do. The latest supercomputers, which used to be Intel's bread and butter, now have 6 physical GPUs for every 1 physical CPU. Intel doesn't have the option of not competing for that market, and servers and workstations aren't going to get any better either, especially as specialized compute becomes more of a thing. Intel is not in a position to ignore the GPU market anymore; they absolutely must compete here or they are dooming themselves to irrelevance.

Yes, but customers right now just buy Nvidia almost unanimously. Buying a PS5/Xbox just reinforces AMD's lack of interest in the PC graphics card market. Buying used just means AMD and Nvidia will only release high-end $1k+ GPUs, while everyone else buys the older generation with no driver updates or warranty. What we need is competition from Intel, so buy their cards. If you buy nothing, suddenly you don't matter. If you buy AMD, Nvidia will just make a faster GPU for $1,600. If you buy a PS5 or Xbox, you'll make AMD happy and they'll continue to sell you $1k XTX graphics cards. Buy Intel, and AMD and Nvidia will realize they're losing potential sales and will have to actually care.
In defense of buying Nvidia, they sell a more complete product, especially if you do things with your PC other than just game. AMD splitting their RDNA and CDNA architectures has some benefits, sure, but also drawbacks if you need a single box to do your work from. Also CUDA... I hate that it is so good.
But many of the things that make game physics good are completely absent or severely anemic on AMD cards, so games just don't get them anymore. AMD moved those over to CDNA, so as I stated in another thread, you have edge cases where a 1080 Ti will outperform a 6800 XT for things like particle effects, and potentially tie it for reflection and refraction calculations for rasterized lighting effects.

I would love for Intel or Nvidia to get a console contract next refresh, or both; a Microsoft/Intel console alongside a Sony/Nvidia one would be an interesting time for sure, and it would force AMD to compete again, as I find their existing GPU launches lazy. It could be argued that a lazy launch from AMD is just as bad, if not worse, than Nvidia's current overpriced, obviously short-lived lineup. But then again, I don't see 1080p going away any time soon, and by the time the whole sub-12GB problem hits we will be on the RTX 6000 series and RX 9000 series, and it will be the same as people complaining about their existing GTX 1000 series cards not holding up, so it's sort of moot.

Bah! I feel like both AMD and Nvidia phoned this consumer generation in, and it's frustrating. Server-side, things are really cool; lots of neat stuff is happening there, but that is a whole topic in itself.
 
Anything involving the number 4 & Nvidia has always been garbage, with the exception of the nForce 4 motherboard chipset.
GeForce 4 blasted ATI's offerings at the time, and had longevity in terms of usable performance. Trying to say otherwise is revisionist.

----------------------------

$450 for a 4060 Ti is less than I thought. And considering the 3060 Ti still goes for that much, this is the first somewhat decent pricing for Ada Lovelace.
 
GeForce 4 blasted ATI's offerings at the time, and had longevity in terms of usable performance. Trying to say otherwise is revisionist.
I remember the ATi 9000 series being very competitive, especially the 9800 Pro. I had a GeForce 4 Ti 4200, though.
 
I remember the ATi 9000 series being very competitive, especially the 9800 Pro. I had a GeForce 4 Ti 4200, though.
Oh yeah, the 9700 Pro conquered all in the fall of that year ('02), but up until then it had been a GeForce 3/4 world.
 
But many of the things that make game physics good are completely absent or severely anemic on AMD cards, so games just don't get them anymore. AMD moved those over to CDNA, so as I stated in another thread, you have edge cases where a 1080 Ti will outperform a 6800 XT for things like particle effects, and potentially tie it for reflection and refraction calculations for rasterized lighting effects.
I would like to see evidence of this.
I would love for Intel or Nvidia to get a console contract next refresh, or both; a Microsoft/Intel console alongside a Sony/Nvidia one would be an interesting time for sure, and it would force AMD to compete again, as I find their existing GPU launches lazy.
I believe that Sony and Microsoft funded RDNA2's development. It just seemed like AMD didn't want to sell RDNA2 GPUs for anything but high-end graphics cards for a while, to give the PS5/Xbox the edge in graphics for mainstream buyers. Just my tin foil hat theory.
It could be argued that a lazy launch from AMD is just as bad, if not worse, than Nvidia's current overpriced, obviously short-lived lineup.
I think it goes together. Considering how much better Intel is at ray tracing, and these are their first graphics cards, AMD does seem too lazy about improving it. It's really lucky for AMD that no game yet requires ray-tracing hardware, because that day will come.
But then again, I don't see 1080p going away any time soon, and by the time the whole sub-12GB problem hits we will be on the RTX 6000 series and RX 9000 series, and it will be the same as people complaining about their existing GTX 1000 series cards not holding up, so it's sort of moot.
The VRAM issue is because we're starting to see games make full use of the PS5's and Xbox Series S's hardware, which means more VRAM. It's also because these games are terribly optimized for PC.
Bah! I feel like both AMD and Nvidia phoned this consumer generation in, and it's frustrating. Server-side, things are really cool; lots of neat stuff is happening there, but that is a whole topic in itself.
I'm more interested in seeing how shareholders react to a loss in GPU sales for Nvidia. GPU sales for AMD don't matter since they were always terrible, but their CPU sales might show up as a problem given the crazy motherboard prices. AMD's and Nvidia's lack of care this generation is more their problem than the consumer's. I'm surprised how well my Vega 56 is holding up with current-generation games at 1080p, and I play games on Linux, so I take a good 10% performance loss in most games. I'm still not compelled to upgrade.
GeForce 4 blasted ATI's offerings at the time, and had longevity in terms of usable performance. Trying to say otherwise is revisionist.
I don't think you remember much of that time period. The Radeon 8500 was worse in every way than the GeForce 4 Ti, until driver bugs were found and fixed. John Carmack found so many bugs in ATI's drivers for the 8500 that by the time the Radeon 9000 was out, the 8500 was faster. So ATI of course rebranded the 8500 as the 9200, and clever people like myself would flash the firmware of their 8500 to the 9200 to get the benefits of new features and better performance. When the GeForce FX was relevant, a lot of new games required DX8.1 minimum, which the Radeon 8500 was capable of using but the GeForce 4 Ti's were not. Even with the superior performance of the GeForce 4 Ti's, it doesn't matter if you can't run the games.
 
Latest Rumour:

As per a report by DigiTimes (via MyDrivers), NVIDIA is heavily considering making the pricing of its upcoming GeForce RTX 4060 Ti graphics card more compelling to gamers and end users. The reason might be the poor general public response to the RTX 4070 Ti and RTX 4070 graphics cards.


https://wccftech.com/nvidia-geforce...-us-same-pricing-3060-ti-3070-ti-performance/


According to the latest report from DigiTimes, suppliers are actively stocking up to ensure that the RTX 4060 Ti, releasing next month, can sell in volume.

The report mentions that NVIDIA may price this product more aggressively than its predecessors. In terms of target performance, it should beat the previous-generation RTX 3070, be comparable to the RTX 3070 Ti, and offer higher energy efficiency, plus new technologies such as DLSS 3.

The RTX 4060 Ti is the first card to adopt the AD106 core, integrating 4352 stream processors with only a 128-bit memory interface and 8GB of GDDR6 (the first use of plain GDDR6 in this generation), running at an effective 18Gbps for 288GB/s of bandwidth.

As for the price everyone cares about, NVIDIA is said to be targeting around US$449, which is US$150 cheaper than the RTX 4070 (and US$350 lower than the RTX 4070 Ti); the Chinese retail version is expected to be around 3,500 yuan.

However, other sources say the RTX 4060 Ti could be priced at $399, the same as the RTX 3060 Ti.
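For what it's worth, the 288GB/s figure in the report follows directly from the claimed bus width and data rate, and the quoted price gaps imply the current 4070/4070 Ti MSRPs. A quick sanity-check sketch in Python (all inputs are the report's claims, not confirmed specs):

```python
# Sanity check of the figures quoted in the DigiTimes/MyDrivers report.
# All inputs are the report's claims, not confirmed specifications.

bus_width_bits = 128      # claimed memory interface width
data_rate_gbps = 18       # claimed effective GDDR6 data rate per pin (Gbps)

# Bandwidth (GB/s) = (bus width in bits / 8) * data rate per pin in Gbps
bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps
print(f"Memory bandwidth: {bandwidth_gb_s:.0f} GB/s")   # -> 288 GB/s, matching the report

# Price gaps quoted in the report (USD)
rtx_4060_ti = 449
rtx_4070 = rtx_4060_ti + 150      # "US$150 cheaper than the RTX 4070"
rtx_4070_ti = rtx_4060_ti + 350   # "US$350 lower than the RTX 4070 Ti"
print(f"Implied RTX 4070 MSRP: ${rtx_4070}, RTX 4070 Ti MSRP: ${rtx_4070_ti}")
```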
 
The RTX 4060 Ti is the first card to adopt the AD106 core, integrating 4352 stream processors with only a 128-bit memory interface and 8GB of GDDR6 (the first use of plain GDDR6 in this generation), running at an effective 18Gbps for 288GB/s of bandwidth.
It has the same amount of VRAM as my Vega 56?
As for the price everyone cares about, NVIDIA is said to be targeting around US$449, which is US$150 cheaper than the RTX 4070 (and US$350 lower than the RTX 4070 Ti); the Chinese retail version is expected to be around 3,500 yuan.
That still isn't cheap.
https://gpu.userbenchmark.com/Faq/What-is-the-Gravity-NBody-GPU-benchmark/88
https://gpu.userbenchmark.com/Compare/Nvidia-GTX-1080-Ti-vs-AMD-RX-6800-XT/3918vs4089

There is a lot of stuff out there on the topic and on the advantages Turing has over much of what came later in compute performance; it's why it was so coveted for mining for so long.
Really, Userbenchmark? For shits and giggles I ran the benchmark on my Vega 56 using Linux Mint and I can't tell how my PC did. How does this show the Nvidia cards have some sort of hidden advantage that applies to gaming?
https://www.userbenchmark.com/UserRun/60972678
 
Latest Rumour:

As per a report by DigiTimes (via MyDrivers), NVIDIA is heavily considering making the pricing of its upcoming GeForce RTX 4060 Ti graphics card more compelling to gamers and end users. The reason might be the poor general public response to the RTX 4070 Ti and RTX 4070 graphics cards.


https://wccftech.com/nvidia-geforce...-us-same-pricing-3060-ti-3070-ti-performance/


According to the latest report from DigiTimes, suppliers are actively stocking up to ensure that the RTX 4060 Ti, releasing next month, can sell in volume.

The report mentions that NVIDIA may price this product more aggressively than its predecessors. In terms of target performance, it should beat the previous-generation RTX 3070, be comparable to the RTX 3070 Ti, and offer higher energy efficiency, plus new technologies such as DLSS 3.

The RTX 4060 Ti is the first card to adopt the AD106 core, integrating 4352 stream processors with only a 128-bit memory interface and 8GB of GDDR6 (the first use of plain GDDR6 in this generation), running at an effective 18Gbps for 288GB/s of bandwidth.

As for the price everyone cares about, NVIDIA is said to be targeting around US$449, which is US$150 cheaper than the RTX 4070 (and US$350 lower than the RTX 4070 Ti); the Chinese retail version is expected to be around 3,500 yuan.

However, other sources say the RTX 4060 Ti could be priced at $399, the same as the RTX 3060 Ti.
There were/are rumours circulating that TSMC is giving Nvidia a deal on the AD106-360-A1 dies. Apparently TSMC is facing pressure from investors because of all the cancelled orders and wants to stay ahead of Intel and its upcoming third-party fab business.

Nvidia also needs more DLSS 3.0 hardware in circulation to spur its adoption with developers.

So $399 is looking more and more like a reality, along with the possibility of game bundles alongside them.
 
So $399 is looking more and more like a reality, along with the possibility of game bundles alongside them.
Chance for Nvidia to do a $400 8GB 4060 Ti & a $500 16GB 4060 Ti

(AMD can follow suit with a $300-$330 8GB 7600 XT & a $380-$400 16GB 7600 XT)
 
Chance for Nvidia to do a $400 8GB 4060 Ti & a $500 16GB 4060 Ti

(AMD can follow suit with a $300-$330 8GB 7600 XT & a $380-$400 16GB 7600 XT)
Big IF on that. AMD pulled a lot of fab time at TSMC too; they might be happier selling off the 6000-series parts.
I mean, they have enough of them that XFX is launching a new lineup to match the aesthetics of the current 7900 parts.
 
Latest Rumour:

As per a report by DigiTimes (via MyDrivers), NVIDIA is heavily considering making the pricing of its upcoming GeForce RTX 4060 Ti graphics card more compelling to gamers and end users. The reason might be the poor general public response to the RTX 4070 Ti and RTX 4070 graphics cards.


https://wccftech.com/nvidia-geforce...-us-same-pricing-3060-ti-3070-ti-performance/


According to the latest report from DigiTimes, suppliers are actively stocking up to ensure that the RTX 4060 Ti, releasing next month, can sell in volume.

The report mentions that NVIDIA may price this product more aggressively than its predecessors. In terms of target performance, it should beat the previous-generation RTX 3070, be comparable to the RTX 3070 Ti, and offer higher energy efficiency, plus new technologies such as DLSS 3.
Man, remember when next-gen cards were 1.5-tier, maybe grasping at 2-tier advancements over prior cards... well, at least that's what that one meme everyone has been posting in the 4070 threads says. Now they're content with 0.5- to 1-tier improvements; we already saw it with the 4070, as it's equivalent to the 3080... maybe.

Also remember when Ti variants came later (except the 3060 Ti), after the tiers were already set, and they were an overclocked/optimized version they could charge more money for? Now they want to start with the Ti variant so they can charge more money up front.
 
Man, remember when next-gen cards were 1.5-tier, maybe grasping at 2-tier advancements over prior cards... well, at least that's what that one meme everyone has been posting in the 4070 threads says. Now they're content with 0.5- to 1-tier improvements; we already saw it with the 4070, as it's equivalent to the 3080... maybe.

Also remember when Ti variants came later (except the 3060 Ti), after the tiers were already set, and they were an overclocked/optimized version they could charge more money for? Now they want to start with the Ti variant so they can charge more money up front.
I don’t know if content is the right word…

But if you look at density gains from past node improvements, you see a pattern (rough sketch below).

From 150nm down to the 22nm process you would see large improvements in density while also seeing a decrease in cost, so dies could get both larger and far denser while staying at the same cost.
From 22nm to 7nm we saw density increases, but costs increased proportionally to the density, so transistor counts went up while dies stayed roughly the same size.
Since 7nm we have seen much smaller density improvements, and cost increases that match but usually exceed the density gains.
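The numbers here are made up purely to illustrate that pattern; they are not real foundry pricing:

```python
# Illustrative only: how cost per transistor behaves when wafer cost grows
# slower than, as fast as, or faster than density. Not real foundry data.

eras = [
    # (label, density gain vs. previous era, wafer cost gain vs. previous era)
    ("150nm -> 22nm era", 2.0, 1.0),   # big density gains, roughly flat cost
    ("22nm -> 7nm era",   2.0, 2.0),   # density gains matched by cost increases
    ("7nm -> N4/N3 era",  1.3, 1.5),   # small density gains, costs rise faster
]

cost_per_transistor = 1.0   # normalized starting point
for label, density_gain, cost_gain in eras:
    cost_per_transistor *= cost_gain / density_gain
    print(f"{label}: relative cost per transistor = {cost_per_transistor:.2f}")

# The printed values fall, flatten, then climb again -- the same pattern described above.
```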

We've reached, or are at least rapidly approaching, a point where the technology can't keep up with the demands placed on it.
It's becoming increasingly clear there needs to be a shift in how graphics are rendered or how GPUs are designed at a fundamental level.

TSMC's costs for N3 and beyond only get worse relative to the density and die-size gains, and N4 is already eye-watering.

Whether the solution is "chiplets", MCM, or a new rendering pipeline, I don't know. But something has to happen, because what's going on now won't work.
 
I don’t know if content is the right word…
Fine, instead of "content" how about "this is what we're giving you, the consumer, and you're going to smile and take it!"

That said, if the "4080 12GB" had been released as the 4070 instead of the 4070 Ti (which they had to do in order to sell it anywhere near where they wanted), and the "4070" had been the 4060 Ti (I'll allow that naming because the 30 series did it too), then sure, we'd get our 1.5 to 2 tiers of generational improvement.
 
Fine, instead of "content" how about "this is what we're giving you, the consumer, and you're going to smile and take it!"

That said, if the "4080 12GB" had been released as the 4070 instead of the 4070 Ti (which they had to do in order to sell it anywhere near where they wanted), and the "4070" had been the 4060 Ti (I'll allow that naming because the 30 series did it too), then sure, we'd get our 1.5 to 2 tiers of generational improvement.
Yes… we are very much paying for a dominatrix at this point.
We are all just dirty, dirty gamers and we know we deserve this.

I'm not getting caught up on the names, because regardless of what they are called, their price and performance don't match what they should for the money.
Why the cards cost what they do is irrelevant; the new series costs too much and delivers too little. The naming just gives us something semantic to quibble over.
 
I remember the ATi 9000 series being very competitive, especially the 9800 Pro. I had a GeForce 4 Ti 4200, though.
GeForce 4 and the Radeon 9000 series are different generations, released at very different times.

The Radeon 8500 and GeForce 4 were in competition.

Whereas GeForce 5 and the Radeon 9000 series were the properly positioned, competing products, and GeForce 5 (the FX series) is where Nvidia messed up badly. GeForce 6 was a massive correction.

I would like to see evidence of this.

I believe that Sony and Microsoft funded RDNA2's development. It just seemed like AMD didn't want to sell RDNA2 GPUs for anything but high-end graphics cards for a while, to give the PS5/Xbox the edge in graphics for mainstream buyers. Just my tin foil hat theory.

I think it goes together. Considering how much better Intel is at ray tracing, and these are their first graphics cards, AMD does seem too lazy about improving it. It's really lucky for AMD that no game yet requires ray-tracing hardware, because that day will come.

The VRAM issue is because we're starting to see games make full use of the PS5's and Xbox Series S's hardware, which means more VRAM. It's also because these games are terribly optimized for PC.

I'm more interested in seeing how shareholders react to a loss in GPU sales for Nvidia. GPU sales for AMD don't matter since they were always terrible, but their CPU sales might show up as a problem given the crazy motherboard prices. AMD's and Nvidia's lack of care this generation is more their problem than the consumer's. I'm surprised how well my Vega 56 is holding up with current-generation games at 1080p, and I play games on Linux, so I take a good 10% performance loss in most games. I'm still not compelled to upgrade.

I don't think you remember much of that time period. The Radeon 8500 was worse in every way than the GeForce 4 Ti, until driver bugs were found and fixed. John Carmack found so many bugs in ATI's drivers for the 8500 that by the time the Radeon 9000 was out, the 8500 was faster. So ATI of course rebranded the 8500 as the 9200, and clever people like myself would flash the firmware of their 8500 to the 9200 to get the benefits of new features and better performance. When the GeForce FX was relevant, a lot of new games required DX8.1 minimum, which the Radeon 8500 was capable of using but the GeForce 4 Ti's were not. Even with the superior performance of the GeForce 4 Ti's, it doesn't matter if you can't run the games.
There weren't any DX 8.1 games which wouldn't run on a GeForce 4; you would simply be missing a graphical feature. So Max Payne 2 would be missing the shiny/sheen effect on character models, and Morrowind's water didn't have the really nice shiny ripples and raindrop effects. However, the GeForce 4 still delivered better framerates. And I don't remember many 8.1 games, really.

No doubt, the Radeon 8500 was a great card. And my first GPU was a Radeon 9000 Pro, which was basically a re-badged 8500 with less VRAM. I eventually got a GeForce 4 Ti 4400 and that carried me until I got a PCI-E GeForce 6800.
 
I believe that Sony and Microsoft funded RDNA2's development. It just seemed like AMD didn't want to sell RDNA2 GPUs for anything but high-end graphics cards for a while, to give the PS5/Xbox the edge in graphics for mainstream buyers. Just my tin foil hat theory.
Not so tinfoil-hatty at all; it was a timed-exclusivity thing, which is why RDNA 2 never really got pushed out to an APU and also why AMD physically disabled the GPU silicon in their Cardinal packages.
I think it goes together. Considering how much better Intel is at ray tracing, and these are their first graphics cards, AMD does seem too lazy about improving it. It's really lucky for AMD that no game yet requires ray-tracing hardware, because that day will come.
This is a fun issue. Intel and Nvidia do incredibly well at reduced-precision 8-bit and 4-bit calculations; AMD removed those functions from their silicon a long time ago and only really does 16-bit and up.
AI, ray tracing, and a lot of the newer machine-learning algorithms thrive at sub-16-bit precision, which leaves AMD playing catch-up while doing what they must to dodge patents. AMD not having 8- and 4-bit calculations on their hardware is why they do so poorly in AI-related tasks across the board, and sadly (or fortunately, depending on how you look at it) it leaves their hardware completely incompatible with GPT 3, 3.5, 4, and the other OpenAI projects. But that let them put that silicon to use for the higher-complexity stuff, and it really gives AMD the edge in dealing with more complex simulations. Sadly, though, AI is starting to encroach on those simulation environments, because what it does with brute-force "AI" approximates a very similar if not identical result. So AMD has to play catch-up there, but I am sure they are up to the challenge, and the experience will help their future ray-tracing implementations.
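To make "reduced precision" concrete, here is a minimal int8 quantization sketch in Python/NumPy; it's a conceptual example only, not tied to any vendor's hardware or to how any specific framework implements it:

```python
import numpy as np

# Symmetric int8 quantization of a float32 weight array: the kind of
# reduced-precision trick AI/ML workloads use to trade a little accuracy
# for 4x less memory traffic and higher throughput on int8-capable hardware.

rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.5, size=1000).astype(np.float32)

scale = np.abs(weights).max() / 127.0                     # map the float range onto [-127, 127]
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)

restored = q.astype(np.float32) * scale                   # dequantize for comparison
print(f"mean abs quantization error: {np.abs(weights - restored).mean():.5f}")
print(f"storage: {weights.nbytes} bytes (fp32) vs {q.nbytes} bytes (int8)")
```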
The VRAM issue is because we're starting to see games make full use of the PS5's and Xbox Series S's hardware, which means more VRAM. It's also because these games are terribly optimized for PC.
So, terribly ported. But the PS5 and Xbox do some really complicated things with how they manage textures and memory pointers that are fundamentally incompatible with PCs as a whole; despite the graphics and CPU architectures being the same as on a PC, how they interact with memory is so fundamentally different that porting is surprisingly hard. This is going to become a problem for the next couple of years. I fully expect developers not to bother with cleaning up and optimizing for the PC in most cases; they will instead balloon the absolute crap out of RAM, VRAM, and storage requirements. Why would they spend a few million optimizing for PC when they can shift those costs to us and require us to buy more RAM, uninstall a different game, play on lower settings (which will still look better than the console), or buy bigger/faster GPUs?
I'm more interested in seeing how shareholders react to a loss in GPU sales for Nvidia. GPU sales for AMD don't matter since they were always terrible, but their CPU sales might show up as a problem given the crazy motherboard prices. AMD's and Nvidia's lack of care this generation is more their problem than the consumer's. I'm surprised how well my Vega 56 is holding up with current-generation games at 1080p, and I play games on Linux, so I take a good 10% performance loss in most games. I'm still not compelled to upgrade.
AI sales are inflating their numbers, but shareholders are angry at Nvidia's smaller margins. Despite all this shit pricing that is too high, and revenue remaining relatively flat (even after the boost from the AI stuff), Nvidia's margins are down to where they were back in 2016. Nvidia says they are eating a lot of costs and not passing them down the channel because their board partners couldn't handle it while still managing to meet their needs. Nvidia might be a bully, but they know they can't exist without their AIBs, so they aren't blind to their serious plights, just indifferent to the small ones.
1080p is why AMD and Nvidia are having such a hard time; you can play at that resolution very well for very little money, and FSR/DLSS have only served to extend the lifespan of cards they would have long hoped were replaced.
 
Anything involving the number 4 & Nvidia has always been garbage, with the exception of the nForce 4 motherboard chipset.

I went through 5(?) nForce 4 mobos for two AMD s939 systems because they kept dying after 2 or 3 years of use. Both were eventually replaced with Intel LGA1366 when getting replacement parts to keep them running became impossible.
 