AMD GPU Sales Not That Far Behind NVIDIA's in Revenue Terms

erek

Hot on the heels of NVIDIA's Financials, AMD's record would like to have a word

"While AMD Radeon PC discrete GPUs have a lot of catching up to do against NVIDIA GeForce products in terms of market-share, the two companies' quarterly revenue figures paint a very different picture. For Q4 2022, AMD pushed $1.644 billion in GPU products encompassing all its markets, namely the semicustom chips powering Xbox Series X/S and PlayStation 5 consoles; and AMD Radeon products. In the same period, NVIDIA raked in $1.831 billion in revenues from semicustom chips powering Nintendo Switch console, GeForce NOW cloud-gaming service, and NVIDIA GeForce products. In purely revenue terms, AMD is bringing in 89% the revenue of NVIDIA from client graphics IP, which begins to explain how AMD is a major player in this market."

Source: https://www.techpowerup.com/305118/amd-gpu-sales-not-that-far-behind-nvidias-in-revenue-terms
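A quick sanity check on that 89% figure, using just the two revenue numbers quoted above (minimal sketch, nothing beyond the arithmetic):

```python
# Q4 2022 client-graphics revenue figures from the quote above (USD billions)
amd_gpu_revenue = 1.644
nvidia_gpu_revenue = 1.831

ratio = amd_gpu_revenue / nvidia_gpu_revenue
print(f"AMD at {ratio:.1%} of NVIDIA's client graphics revenue")  # ~89.8%
```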
 
AMD doesn't have much on the professional side, right? Sure, a slew of supercomputer wins, but so does Nvidia.
 
Well, if they keep eating into the server market, they have a good foothold to bundle in AI / CDNA cores in some form.
Yes, but not really. Nvidia's Tensor Cores hands down crush anything AMD is currently offering in the server space; it's not close for anything AI-related.
For general compute and calculations AMD gets the win between the two on both power and performance, but AI is a solid win for Nvidia here, and I don't think AMD was even trying to compete.
I mean, ChatGPT, which uses GPT-3.5, was trained over a year on 10,000 Nvidia GPUs for a reason; you would need something upwards of 23,000 of the equivalent AMD cards to reach that same level of performance in AI-related tasks.
The Triton project (https://openai.com/blog/triton/), which is supposed to be the toolkit that replaces CUDA for AI, doesn't even support AMD GPUs at the moment:
"CPUs and AMD GPUs are not supported at the moment, but we welcome community contributions aimed at addressing this limitation."

Edit - Correction:
I based that 23,000 figure on FP16; it turns out the modern AI engines need FP4 and FP8, which the current AMD cards don't do natively at all (when they do, it's emulated), and the modern Nvidia cards are between 9 and 30x faster than AMD's current best, making it something that is not at all possible on AMD's offerings.
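For what it's worth, the 23,000 figure falls out of a simple per-card throughput ratio, assuming near-perfect scaling. A rough sketch with placeholder numbers (the 2.3x gap is illustrative, not a vendor spec):

```python
import math

def cards_needed(reference_cards: int, reference_tflops: float, other_tflops: float) -> int:
    """How many cards of the slower vendor match a run on the reference cards,
    assuming training throughput scales linearly with card count."""
    return math.ceil(reference_cards * reference_tflops / other_tflops)

# 10,000 reference GPUs and an assumed ~2.3x per-card throughput advantage -> ~23,000 cards
print(cards_needed(10_000, reference_tflops=2.3, other_tflops=1.0))  # 23000
```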
 
AMD added AI cores to the compute units in RDNA3 that are somewhat similar to the Tensor Cores Nvidia uses. Obviously that means they're playing catch-up right now, but it also means they're at least trying to support that sort of thing, and, like Nvidia, I doubt it's just, or even mainly, for the consumer side.
 
I do not understand what we are looking at; what, if not GPUs, is driving almost all of Nvidia's revenue?

How is the revenue divided here between the CPU part and the GPU part of a console APU?

Isn't this the companies' gaming divisions and not GPU sales?
 
But they have not added them to any of their available data center offerings; their first generation of AI cores on a data center product will be on the MI300.
The only thing known about the MI300 and AI is AMD's claim that it is 8x faster than the MI250 in AI operations, but as it currently stands the MI250 doesn't support native low-precision math below 16 bits, which is what basically all the current AI toolsets use.
Nvidia Hopper, on the other hand, is between 9 and 30x faster than the MI250X in those AI operations, and the MI250X is only marginally faster than the MI250, so AMD's next generation, based on their slides, looks to be significantly slower than Nvidia's current generation (in AI), while also not supporting any of the existing toolsets. And by the time the MI300 actually launches, Nvidia will be announcing their next generation of AI accelerators, which will obviously be faster than their current gen and support all the existing tools.
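Putting those multipliers side by side, normalised to the MI250 (a sketch of the claims above only; it ignores the small MI250/MI250X gap and says nothing about real workloads):

```python
# Normalise AI throughput to MI250 = 1, per the figures cited above.
mi250 = 1
mi300_claimed = 8 * mi250                         # AMD's claimed 8x uplift over the MI250
hopper_low, hopper_high = 9 * mi250, 30 * mi250   # the 9-30x range cited vs the MI250X

print(f"MI300 (claimed): {mi300_claimed}x, Hopper (cited): {hopper_low}x-{hopper_high}x")
# Even the low end of the Hopper range sits above the claimed MI300 uplift.
```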
 
I'm just saying that they're clearly moving in that direction with their hardware even if they're playing catch up right now.
 
They are starting, but I don't think it was on AMD's radar; this AI "craze" honestly hit them and a lot of other companies out of nowhere.
AMD is getting into this game a solid six years late; they have a lot of work to do, and I hope they are up for it.
Honestly, AMD's strength here is pure compute, which makes them a powerhouse for a lot of the simulation and scientific calculations used by the energy and aerospace industries. AI there is new, and it is really only used to take a design and its simulation results, make a minor tweak, and re-simulate to see whether the results are better or worse; it is very incremental and uses some of the much older machine-learning models, which don't make use of the Tensor Cores nearly as much (if at all).
There AMD is still the vastly superior option, and at some point AMD and Nvidia both need to look at what they are offering their cards for and say, OK, this one does this.
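That "tweak and re-simulate" loop is basically incremental local search; a toy sketch of the idea, where simulate() and tweak() are hypothetical stand-ins rather than any real solver:

```python
import random

def tweak(design: dict) -> dict:
    # Hypothetical: nudge one design parameter by a few percent.
    d = dict(design)
    key = random.choice(list(d))
    d[key] *= 1 + random.uniform(-0.05, 0.05)
    return d

def simulate(design: dict) -> float:
    # Hypothetical stand-in for an expensive CFD/FEA run; higher score is better.
    return -sum((v - 1.0) ** 2 for v in design.values())

def optimise(design: dict, iterations: int = 100) -> dict:
    best, best_score = design, simulate(design)
    for _ in range(iterations):
        candidate = tweak(best)
        score = simulate(candidate)   # every step re-runs the full simulation
        if score > best_score:        # keep the tweak only if the results improved
            best, best_score = candidate, score
    return best

# e.g. optimise({"chord": 1.2, "sweep": 0.8})
```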

The only OAM GPUs that do both compute and AI acceleration well enough are the Intel Ponte Vecchio accelerators, and while they can do both, they aren't as fast as AMD or Nvidia at either; they do, however, offer a flexibility the others don't provide.
 
CDNA is the professional side. No idea how much they sell, though.

You would be surprised. With the failure of most of the game streaming services, though, that will hurt sales going forward. CDNA powered quite a bit of that stuff. I'm sure AMD was looking forward to years of CDNA sales for Stadia and the like. lol
 
Hmm. Looking at it from a client gaming segment perspective, Nvidia still manages to stay ahead in revenue despite AMD's significant design wins with Sony, MS, and Valve.
 

Games consoles now deliver fully one quarter of AMD's revenues


For the record, AMD generated a total of:
  • $6.8 billion in revenue from gaming chips (includes GPUs, PS5, Xbox, Steam Deck),
  • $6 billion from data center chips,
  • $6.2 billion from PC processors, and
  • $4.5 billion from embedded chips.

That's according to an official AMD filing (via Tom's Hardware). "One customer accounted for 16% of our consolidated net revenue for the year ended December 31, 2022. Sales to this customer consisted of sales of products from our Gaming segment," the filing revealed.
That customer can only really be Sony and the PS5, which comfortably outsells the Xbox Series consoles.

AMD hasn't detailed revenues specifically for Microsoft consoles, but based on those numbers, if 30-odd million console chips is 16% of AMD's business, another 20-odd million will be roughly 10%, and combined you're looking at something in the region of 25% of AMD's revenue coming from those gaming consoles. Throw in the Steam Deck, also packing a chip from the red team's semi-custom silicon division, and that's maybe a conservative estimate.

https://www.pcgamer.com/games-consoles-now-deliver-fully-one-quarter-of-amds-revenues/
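Roughing out the arithmetic behind that ~25% figure, using the unit counts mentioned above (the equal revenue-per-chip assumption is mine, not AMD's):

```python
# PS5: ~30M units = 16% of revenue; scale the ~20M Xbox units by the same revenue per chip.
ps5_units, xbox_units = 30e6, 20e6
ps5_share = 0.16
xbox_share = ps5_share * (xbox_units / ps5_units)                # ~10.7%
print(f"Combined console share: ~{ps5_share + xbox_share:.0%}")  # ~27%, i.e. roughly a quarter
```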

Sravan Kundojjala, a semiconductor industry analyst, noted that if Xilinx results are excluded, Sony accounts for 20% of AMD's revenue, probably making it the company's largest customer in recent history.

Based on softening demand for PCs and the game console cycle, it looks like AMD's data center business will become the company's main source of revenue in 2023.

https://www.tomshardware.com/news/sony-becomes-largest-customer-of-amd
 