AMD CTO claims to have completed AI enablement of entire portfolio and 2024 is the deployment year

Marees

Tbh, I couldn't make heads or tails of this statement.

Hope someone else on the forum here can clarify

It appears that AMD may be exploring using AI algorithms for their upscaling as well, as hinted by Mark Papermaster (AMD CTO):

2024 is a giant year for us because we spent so many years on our hardware and software capabilities for AI. We have just completed AI-enabling our entire portfolio, so you know, cloud, edge, PCs and our embedded devices and gaming devices. We are enabling our gaming devices to upscale using AI, and 2024 is really a huge deployment year. So now the bedrock's there, the capabilities are there. I have talked to you about all the partners that. So 2024 is for us a huge deployment. […]
— Mark Papermaster, AMD CTO
As per Papermaster’s remarks, there’s a possibility that the company is considering integrating AI into the upscaling.

https://videocardz.com/newz/amd-exec-hints-at-ai-powered-upscaling


https://www.youtube.com/watch?v=EtqTnLoiXUo&t=2142s
 
I think this is in reference to the AMD Ryzen AI engine they had exclusive to the 7040U series mobile chips, which is AMD's version of Intel's NPU.
 
IDK, kinda sounds like it's just AMD marketing speak and people are trying to turn it into news. I'm sure they've been working on enhancing FSR with AI for a while now.
 
IDK, kinda sounds like it's just AMD marketing speak and people are trying to turn it into news. I'm sure they've been working on enhancing FSR with AI for a while now.
No, the current AMD hardware is abysmally bad at AI acceleration.
AMD is just picking and choosing between 2 or 3 common upscaling methods found in TVs, but instead of doing it on the whole scene they do it on an object-by-object basis as the scene is being drawn. For the frame generation they are again using an off-the-shelf transition between frames 1 and 3 to generate 2.
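To picture the naive version of that idea, here is a minimal sketch that synthesizes frame 2 as a plain blend of frames 1 and 3. This is illustrative only; real frame generation (AMD's FMF, Nvidia's DLSS 3) warps pixels along motion vectors rather than blending in place, and none of this is AMD's actual pipeline:

```python
import numpy as np

def interpolate_frame(frame1: np.ndarray, frame3: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Naive linear blend of two rendered frames to synthesize the one between."""
    blend = (1.0 - t) * frame1.astype(np.float32) + t * frame3.astype(np.float32)
    return blend.astype(frame1.dtype)

# Two hypothetical 1080p RGB frames
f1 = np.zeros((1080, 1920, 3), dtype=np.uint8)
f3 = np.full((1080, 1920, 3), 200, dtype=np.uint8)
f2 = interpolate_frame(f1, f3)  # the synthesized in-between frame
print(f2[0, 0])  # -> [100 100 100]
```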
 
At first glance, the article is a bit strange.

The AMD CTO says:
We are enabling our gaming devices to upscale using AI and 2024 is really a huge deployment year....

Then there's speculation about what it could mean?
As per Papermaster’s remarks, there’s a possibility that the company is considering integrating AI into the upscaling.

Possibility? Sounds quite straightforward. Sounds to me like RDNA 4, launching this year, will have gaming upscaling that uses machine learning.
No, the current AMD hardware is abysmally bad at AI acceleration.

The latest DLSS upscaling works on an RTX 2060, which I am not sure has even 25% of the FP16 inference power of a 5060.

Current AMD hardware does beat an RTX 2060 in many AI inference tasks like Stable Diffusion: a 7800 XT matches an RTX 2060 non-Super under some diffusion benchmarks, for example, and a 7900 XTX can match a 3070. They are far behind, but considering we are talking about something that works well enough on six-year-old RTX 2060 hardware, and that would be made with their hardware's strengths and limitations in mind, it seems possible: you can be abysmally bad at AI acceleration in 2024 and still achieve 2018-level performance.
 
At first glance, the article is a bit strange.

The AMD CTO says:
We are enabling our gaming devices to upscale using AI and 2024 is really a huge deployment year....

Then there's speculation about what it could mean?
As per Papermaster’s remarks, there’s a possibility that the company is considering integrating AI into the upscaling.

Possibility? Sounds quite straightforward. Sounds to me like RDNA 4, launching this year, will have gaming upscaling that uses machine learning.


The latest DLSS upscaling works on an RTX 2060, which I am not sure has even 25% of the FP16 inference power of a 5060.

Current AMD hardware does beat an RTX 2060 in many AI inference tasks like Stable Diffusion: a 7800 XT matches an RTX 2060 non-Super under some diffusion benchmarks, for example, and a 7900 XTX can match a 3070. They are far behind, but considering we are talking about something that works well enough on six-year-old RTX 2060 hardware, and that would be made with their hardware's strengths and limitations in mind, it seems possible: you can be abysmally bad at AI acceleration in 2024 and still achieve 2018-level performance.
I thought he was referring to training the AIs, which is different. But fair enough.

Just as an aside though, the RTX 2000 series doesn't do the upscaling with FP16; it does it with INT8.
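For anyone wondering what running the network in INT8 instead of FP16 means in practice: weights and activations get quantized to 8-bit integers, the dot products accumulate in INT32, and one float rescale converts back at the end. A toy sketch of symmetric quantization (not Nvidia's actual scheme):

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Symmetric per-tensor quantization: float32 -> (int8 values, scale)."""
    scale = float(np.max(np.abs(x))) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

# Toy weights and activations for one layer of a hypothetical upscaling network
rng = np.random.default_rng(0)
w = rng.standard_normal(64).astype(np.float32)
a = rng.standard_normal(64).astype(np.float32)

qw, sw = quantize_int8(w)
qa, sa = quantize_int8(a)

# INT8 multiplies accumulate in int32, then a single float rescale at the end
int8_out = np.dot(qw.astype(np.int32), qa.astype(np.int32)) * (sw * sa)
fp32_out = float(np.dot(w, a))
print(int8_out, fp32_out)  # close results at a fraction of the compute cost
```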
 
I thought he was referring to training the AIs, which is different. But fair enough.
I can see them using their MI300-type hardware to do the training; buying a competitor's solution (even a non-Nvidia one) for that would feel strange. But a consumer gaming device enabled to upscale using AI is 100% inference, I think; the gaming device never does the training part.

If it is an ML system that helps create the next version of the FSR algorithm (like rapidly giving you a score for your code change), and you make one that still does not use a black box at the end but used AI during dev time, that would be brilliantly phrased to be misleading here.
 
If it is an ML system that helps create the next version of the FSR algorithm (like rapidly giving you a score for your code change), and you make one that still does not use a black box at the end but used AI during dev time, that would be brilliantly phrased to be misleading here.
That's how Nvidia does their algorithms, but in retrospect I think they are talking about having their AI accelerators on all their upcoming CPU releases this year.
 
No, the current AMD hardware is abysmally bad at AI acceleration.
AMD is just picking and choosing between 2 or 3 common upscaling methods found in TVs, but instead of doing it on the whole scene they do it on an object-by-object basis as the scene is being drawn. For the frame generation they are again using an off-the-shelf transition between frames 1 and 3 to generate 2.

I never said their current stuff uses AI, I said they have probably been working on a version that does.
 
That's how Nvidia does their algorithms, but in retrospect I think they are talking about having their AI accelerators on all their upcoming CPU releases this year.
I am not sure if the sentence:
we're we're enabling our our gaming devices to to upscale using AI

needs much analysis. Maybe he misspoke or I am missing something, but it seems to simply and clearly mean having ML involved in FSR-type gaming upscaling, like Intel and Nvidia.
 
I am not sure if the sentence:
we're we're enabling our our gaming devices to to upscale using AI

needs much analysis. Maybe he misspoke or I am missing something, but it seems to simply and clearly mean having ML involved in FSR-type gaming upscaling, like Intel and Nvidia.
It is possible that he misspoke, but just taking him literally:

Enabled (past tense) & ready to be deployed (future tense)

I think he is referring to the APU/NPU (from the time of the OG Steam Deck LCD) already having AI hardware, to be utilized via the Windows 12 rollout later this year.
 
It could mean they are working on a software solution, or they might be building hardware into the next family of GPUs. Nvidia's GPUs have had AI-capable cores (Tensor cores, I think) since the 2xxx family.
So AMD is copying DLSS more closely, assuming he is talking about gaming purposes. He does say gaming.
Unless there is something else that Nvidia is doing with AI and gaming that goes beyond DLSS and Frame Gen.
 
"[...]We have spent so many years developing our hardware and software capabilities for AI; we've just completed AI enabling our entire portfolio -- So, cloud, edge, uh ya'know, our PCs, our embedded devices, our gaming devices. We, we're enabling our gaming devices to upscale using AI. Uh, in 2024[...]"
 
It means all of their CPUs/APUs have NPUs.
 
I never said their current stuff uses AI, I said they have probably been working on a version that does.
Yeah, you didn't, and I am just confused. No coffee for me for the next 3 weeks, and caffeine withdrawal is an actual thing.
 
I am not sure if the sentence:
we're we're enabling our our gaming devices to to upscale using AI

needs much analysis. Maybe he misspoke or I am missing something, but it seems to simply and clearly mean having ML involved in FSR-type gaming upscaling, like Intel and Nvidia.
My brain is just addled and confuzzled.
 
I literally asked Bing AI to write a press release about an imaginary company integrating AI into its products, with no description of those products:

"ARD, a trailblazing technology company, is proud to announce a groundbreaking milestone: the seamless integration of artificial intelligence (AI) into its entire product ecosystem. This strategic move positions ARD at the forefront of innovation, enhancing user experiences and driving unparalleled efficiency."

It's buzzwords. It's all just buzzwords. It means nothing.
 
To me, it totally sounds like this:
If it is, most of the world will be AI compute in a decade.

[Image: AWS Q4 2017 revenue and income chart]


Between that video and now, cloud computing revenue at Amazon multiplied by around 180 times, or 18,000% (Microsoft's as well); the joke that HP would fail in that market, like many others trying not to be left behind, was spot on too.
 
Between that video and now, cloud computing revenue at Amazon multiplied by around 180 times; the joke that HP would fail in that market was spot on too.
My point (and the video's) wasn't that cloud was bullshit; it was about HP not having any cloud product and talking buzzwords. This AMD shit sounds the same. It sounds like "We will totally have that AI thing that everyone is talking about and it will be the best AI and will do all of the things that people like to do with AI!"
 
My point (and the video's) wasn't that cloud was bullshit; it was about HP not having any cloud product and talking buzzwords. This AMD shit sounds the same. It sounds like "We will totally have that AI thing that everyone is talking about and it will be the best AI and will do all of the things that people like to do with AI!"
HP Cloud had a full, actual stack of cloud services; they just failed (AWS pricing was like a third of HP's). I think part of the joke back then was thinking that cloud, unlike PC-Internet-Mobile-Social Media, would be a fad or just a buzzword; it ended up being one of the biggest businesses in the world.

AMD has actual, capable ML hardware and software products; among all the businesses talking AI, AMD and Nvidia are two that have serious products here. The terms are a little buzzy.

"Edge AI solutions" are just AI inside enterprise-controlled computing; "cloud" is their MI300 products and EPYC CPUs used for training and inferring with ML models. It is said in key-buzzword fashion, but there is an existing product being sold for each of them, I think.
 
If it is, most of the world will be AI compute in a decade.

[Image: AWS Q4 2017 revenue and income chart]

Between that video and now, cloud computing revenue at Amazon multiplied by around 180 times, or 18,000%; the joke that HP would fail in that market was spot on too.
The speed at which localized computing improves is decreasing. CPU and GPU year-over-year performance gains are dwindling, and the gains we do get come more often than not from the transition to a more costly process node, an increase in power draw, or likely both.
Things can't continue that way; migrating the computing platforms and the programming practices over to something that is both platform- and content-aware is necessary for better efficiency.
For decades developers have been able to say, well, they can get more RAM, or they can get a faster machine; now that's not feasible for most.
Now it is time to get the APIs in place to let the platform itself decide how best to process the instructions, but to do that we at least need the same functionality and interface across the hardware platforms.
An ARMv8 device (Apple or Android) can decode and upscale a stream from Netflix, Disney, Amazon, or any of the others on something like half a watt, because the platform has some pretty nice adaptive algorithms running on custom AI- or ML-based accelerators, whereas it takes a PC 20x that just to decode, and the PC often isn't even capable of upscaling.
It's long past due for the PC market to catch the hell up.
 
HP Cloud had a full, actual stack of cloud services; they just failed (AWS pricing was like a third of HP's). I think part of the joke back then was thinking that cloud, unlike PC-Internet-Mobile-Social Media, would be a fad or just a buzzword; it ended up being one of the biggest businesses in the world.

AMD has actual, capable ML hardware and software products; among all the businesses talking AI, AMD and Nvidia are two that have serious products here. The terms are a little buzzy.

"Edge AI solutions" are just AI inside enterprise-controlled computing; "cloud" is their MI300 products and EPYC CPUs used for training and inferring with ML models. It is said in key-buzzword fashion, but there is an existing product being sold for each of them, I think.
HP's cloud has shifted, and the GreenLake platform is pretty good IF you require its services. I would use GreenLake before I used AWS for sure, but I don't have enough data to make it worth the initial investment, so I use Azure for an offsite, off-domain backup instead.
 
Until I next upgrade my CPU or GPU, and there are specific products to consider, all this is just background noise for me. In super theory, I expect many/most software and lots of hardware products to be positively impacted by AI. But I'm a consumer, not an industry analyst or pundit.

Hope I haven't hurt anyone's feelings.
 
Until I next upgrade my CPU or GPU, and there are specific products to consider, all this is just background noise for me
Compute using heavy floating-point matrix computation, ML or not, could continue to be background noise for users, yes; the podcast referenced in the OP is aimed toward VCs, startups, and enterprise (and the people working for them).

GPUs getting massively parallel work vs. quite linear work for CPUs can stay pure background noise: just sell me the computer that plays games well enough at a good price.

Many people that buy a PS5 today do not know or care much that there is a CPU and a GPU good at different things (back in the days when you had computers without a 3D GPU and others with one, it was a rather big deal); the specialized hardware for decompressing Kraken will not be a big talking point of the next one.

The same will occur: when a gamer on their PS6 hears a character whose dialogue is not a pre-recorded actor but a generated voice with almost no lag, speaking in the language of the gamer's choice, with the age, gender, accent, tone, and exhausted-or-injured sound of the voice dynamically following the unpredictable condition of the NPC speaking, or of your character, at first it will be called AI. If it uses a special part of the GPU or an NPU, it will be a big deal; really fast it will just be taken for granted, and what gets called AI will continue to morph toward the things computers are not yet really good at.

Physics engines and sound engines that use ML, NPC decisions, generative AI for textures, dialogue, voices, speaking faces and emotion deformation, trees, buildings and their parts: from a user standpoint, none of it will feel different from things like SpeedTree in the past, which did something quite similar without ML.
 
Compute using heavy floating-point matrix computation, ML or not, could continue to be background noise for users, yes; the podcast referenced in the OP is aimed toward VCs, startups, and enterprise (and the people working for them).

GPUs getting massively parallel work vs. quite linear work for CPUs can stay pure background noise: just sell me the computer that plays games well enough at a good price.

Many people that buy a PS5 today do not know or care much that there is a CPU and a GPU good at different things (back in the days when you had computers without a 3D GPU and others with one, it was a rather big deal); the specialized hardware for decompression will not be a big talking point of the next one.

The same will occur: when a gamer on their PS6 hears a character whose dialogue is not a pre-recorded actor but a generated voice with almost no lag, speaking in the language of the gamer's choice, with the age, gender, accent, tone, and exhausted-or-injured sound of the voice dynamically following the unpredictable condition of the NPC speaking, or of your character, at first it will be called AI. If it uses a special part of the GPU or an NPU, it will be a big deal; really fast it will just be taken for granted, and AI will continue to morph toward the things computers are not yet really good at.

Physics engines and sound engines that use ML, NPC decisions, generative AI for textures, dialogue, voices, speaking faces and emotion deformation, trees, buildings and their parts: from a user standpoint, none of it will feel different from things like SpeedTree in the past, which did something quite similar without ML.
There is also a lot of neat work going on currently, essentially building a front end to the actual CPU, GPU, NPU, etc. available inside the device.
No more specifically targeting system resources by calling a library that targets a specific NPU, which becomes useless later if you switch from Nvidia to AMD, or AMD to Intel, and no more worrying about how many CPU cores are present, or whether there is a high-end GPU at all. The goal is to do away with the situation where you have upgraded to a newer CPU with a new feature that is useless because your software doesn't use it, resulting in the "upgrade" being performance-neutral or, god forbid, a performance decrease because the new CPU is lacking something else.
Sometimes being able to program as close to the metal as possible is great, but the reality is the PC space is far too vast and complex for that to be feasible outside of bespoke solutions. A solid, standardized abstraction layer for the OS to interact with will be the next thing, I am sure of it.
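A toy sketch of the kind of abstraction layer described above, where the application declares the work and the runtime picks whichever backend is actually present; every name here is hypothetical, not a real API:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Backend:
    name: str
    is_available: Callable[[], bool]
    priority: int  # higher = preferred when present

def dispatch(work: str, backends: list[Backend]) -> str:
    """Pick the best available backend instead of hard-coding one vendor's library."""
    for b in sorted(backends, key=lambda b: b.priority, reverse=True):
        if b.is_available():
            return f"{work} -> {b.name}"  # a real runtime would launch the kernel here
    raise RuntimeError("no capable backend found")

backends = [
    Backend("npu", is_available=lambda: False, priority=3),  # not every PC has one
    Backend("gpu", is_available=lambda: True, priority=2),
    Backend("cpu", is_available=lambda: True, priority=1),   # always-present fallback
]
print(dispatch("upscale_frame", backends))  # -> "upscale_frame -> gpu"
```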
 
Interesting


Current-gen RDNA 3 has dedicated AI acceleration with Wave MMA (matrix multiply-accumulate) instructions, which can help improve AI-based performance and also benefits ray tracing instructions, similar to Nvidia's Tensor cores.


So AI-supported FSR only for RDNA 3 & above? 🤔

https://forums.tomshardware.com/threads/amd-confirms-it-is-working-on-an-ai-upscaler-for-gaming-–-cto-papermaster-says-its-part-of-ai-enabling-our-entire-portfolio.3838488/post-23215981
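For context, a matrix multiply-accumulate instruction fuses D = A x B + C over a small tile into one hardware operation, which is what makes running a neural upscaler per frame cheap enough. A rough numpy picture of a single WMMA-style step; the 16x16 tile and FP16-in/FP32-accumulate shape are illustrative assumptions, not necessarily RDNA 3's actual formats:

```python
import numpy as np

# One matrix multiply-accumulate (MMA) step: D = A @ B + C on a fixed tile.
# Hardware such as RDNA 3's WMMA or Nvidia's Tensor cores performs this as a
# single instruction instead of hundreds of scalar multiply-adds.
rng = np.random.default_rng(0)
A = rng.standard_normal((16, 16)).astype(np.float16)  # low-precision inputs
B = rng.standard_normal((16, 16)).astype(np.float16)
C = np.zeros((16, 16), dtype=np.float32)              # higher-precision accumulator

D = A.astype(np.float32) @ B.astype(np.float32) + C   # one WMMA-style step
print(D.shape, D.dtype)  # (16, 16) float32
```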
Maybe, maybe not. One could speculate about a DP4a fallback (like XeSS), with a further fallback to the non-AI method used in FSR 2/3, providing a solution with an even broader reach.
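For reference, DP4a is the primitive such a fallback would lean on: a dot product of four INT8 pairs accumulated into 32 bits, available on most recent GPUs even without dedicated matrix units. A tiny sketch of what the instruction computes:

```python
import numpy as np

def dp4a(a: np.ndarray, b: np.ndarray, acc: int) -> int:
    """Emulates the DP4A instruction: a dot product of four int8 pairs,
    accumulated into a 32-bit integer. XeSS's fallback path builds its
    INT8 inference out of this kind of primitive."""
    assert a.dtype == np.int8 and b.dtype == np.int8 and a.size == b.size == 4
    return int(np.dot(a.astype(np.int32), b.astype(np.int32))) + acc

result = dp4a(np.array([1, -2, 3, 4], dtype=np.int8),
              np.array([5, 6, -7, 8], dtype=np.int8),
              0)
print(result)  # 1*5 + (-2)*6 + 3*(-7) + 4*8 = 4
```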
 