Why Nvidia doesn't feel like it needs to care about pleasing Geforce customers anymore

They are, just not at the top end. Not everyone is a 4090-class buyer.
Not really. They're losing market share, throwing out a "7999.999XXXTXXX AWESOME FAST" card that competes with Nvidia's 4080 in raster and 4060 in RT.

They aren't leaving the high end by choice, no matter what they say. Profit per square millimetre of silicon at the high end is ridiculously high. They would compete there if they could.
 
But right now? AMD can't. If they could whip up a 4090 killer they would have already done so. Agreed 100%.
 
I wasn't really referring to market share but to the actual products themselves. The products compete pretty well at their given price points (at least in raster performance, which is the majority of gaming), but it doesn't matter because people tend to buy Nvidia anyway. I've seen people choose a GTX 1050 over an RX 580 in the past despite the RX 580 being way faster at a similar price. Nowadays, though, GPU performance is about more than raster, and AMD is quickly losing more and more ground in that respect, so if they don't make some major changes then yeah, for sure they won't be able to compete at all.
 
I think it's a design choice, not incompetence. AD102 has 30% more transistors than Navi 31 and performs about 30% faster. AMD's wafer starts are even more strained than Nvidia's: they have to fab Epyc, Ryzen, Navi, and Instinct, parts with wildly different markets and margins, all at the same fab on the same node.
 
I heard some interesting stats yesterday from a Hardware Unboxed user survey about GPU market share.

The median price of GPUs owned by survey respondents was $600. That means about half of the surveyed people bought cards that cost less than $600, and the other half bought cards that cost more.

In terms of ownership of video cards under $600, AMD was at 48%. For the $600-800 bracket it was 57% Nvidia owners, and for $800+ it was 69% Nvidia owners.

So there you go. Take that for what you will, but the data seems to show that AMD does not compete well at the high end but generally is quite competitive in the sub-$600 market.

Video: https://youtu.be/yG_WjpFqWzs?si=glrZ4FsVUgeUItwp&t=1147
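The median/bracket split above can be sketched with a toy example. Note the price list here is made up purely for illustration; it is not Hardware Unboxed's actual survey data.

```python
# Illustrative only: made-up GPU prices, NOT Hardware Unboxed's survey data.
from statistics import median

prices = [300, 350, 450, 550, 600, 600, 700, 850, 1000, 1600]

# The median is the midpoint: half the surveyed cards cost less, half cost more.
mid = median(prices)

# Share of cards in each bracket, mirroring the survey's under/over-$600 split.
under = len([p for p in prices if p < 600]) / len(prices)
over = len([p for p in prices if p > 600]) / len(prices)

print(f"median=${mid:.0f}, under-$600 share={under:.0%}, over-$600 share={over:.0%}")
```

With this toy list the median lands at $600, matching the shape of the survey result, though the bracket percentages are of course arbitrary.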
 
And yet AMD STILL can't compete...
They're targeting the 4070 Ti Super - 4030 range, which is more of a fight against Intel than Nvidia. Good thing Intel's struggling very badly at the moment and doesn't seem like they'll ever get back to the good ole' Sandy Bridge days again.
 
The funny thing is, I remember buying the first graphics card NVIDIA branded as having a GPU: the GeForce 256, in 1999. I noticed the term GPU on the back and dismissed it as utter marketing nonsense. I just wanted more frames in Half-Life. Man, if only I had a DeLorean…
 
I also remember... I was like, T&L sounds great... Same for the very basic per-pixel stuff it could do. In the end I was like "where are the games that support this?" and went about my business...
 