Nvidia is ‘No longer a graphics company’

DPI

“Jensen sent out an email on Friday evening saying everything is going to deep learning, and that we were no longer a graphics company. By Monday morning, we were an AI company."

It’s no secret that Nvidia has quickly morphed into an AI company. Although it creates some of the best graphics cards for PC gamers, the company’s supercomputing efforts have catapulted it into being a trillion-dollar company, and that transformation was spurred on by the monumental rise of ChatGPT. That shift, from a graphics company to an AI company, was an intentional choice by Nvidia’s CEO Jensen Huang.

In a moment of saying the quiet part out loud, Greg Estes, the vice president of corporate marketing at Nvidia, said: “[Jensen] sent out an email on Friday evening saying everything is going to deep learning, and that we were no longer a graphics company. By Monday morning, we were an AI company. Literally, it was that fast.”

https://www.digitaltrends.com/computing/nvidia-said-no-longer-graphics-company/
https://www.newyorker.com/magazine/2023/12/04/how-jensen-huangs-nvidia-is-powering-the-ai-revolution
 
People think this means they are going to stop making graphics cards. They aren't.
But you bet your ass they want to replace the raster pipeline with something done entirely via some form of DLSS frame generation.

Raster isn't getting them to playable 4K any time soon, and it costs a fortune to tweak drivers and work with developers to make it run well on everything they produce. They are betting on finding a way for a game and its inputs to simply tell the GPU what should be happening, with the GPU rendering what it thinks is being asked for, but at 60+ FPS.
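Roughly what that idea might look like, as a toy sketch in Python (every name here is made up for illustration; this is not how DLSS frame generation actually works internally): render only half the frames the traditional way, and let a model fill in the rest from the surrounding frames and motion data.

```python
# Toy sketch of the "render less, infer more" idea. Everything here is illustrative;
# `predict_intermediate` stands in for whatever learned model would fill in frames.
import numpy as np

def render_frame(t: float) -> np.ndarray:
    """Pretend traditional raster pass: expensive in real life, faked here."""
    return np.full((4, 4, 3), t, dtype=np.float32)

def predict_intermediate(prev: np.ndarray, nxt: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Stand-in for a learned frame-generation model.
    A real model would warp along motion vectors; this toy just blends."""
    del motion  # unused in the toy version
    return 0.5 * (prev + nxt)

def present_60fps_from_30fps(duration_s: float) -> list:
    """Render at 30 FPS, present at ~60 FPS by inserting one generated frame per pair."""
    rendered = [render_frame(t) for t in np.arange(0.0, duration_s, 1 / 30)]
    output = []
    for prev, nxt in zip(rendered, rendered[1:]):
        motion = nxt - prev          # crude proxy for engine-supplied motion vectors
        output.append(prev)          # the frame we actually rendered
        output.append(predict_intermediate(prev, nxt, motion))  # the frame we guessed
    output.append(rendered[-1])
    return output

if __name__ == "__main__":
    frames = present_60fps_from_30fps(1.0)
    print(f"rendered {int(1.0 * 30)} frames, presented {len(frames)}")
```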
 
Yep, gamers will now get the second-string parts from Nvidia. Maybe. If Intel ends up being real competition... or AMD actually releases some top-tier stuff, Nvidia might just decide it's not even worth putting out cast-offs and secondary-process silicon to actually compete. If there is even a slim possibility Nvidia loses a gaming crown, Jensen will just say pull it imo. Better to exit king than be seen as inferior.

I imagine we get Super versions of this gen. Then Nvidia is just going to skip next-gen silicon in gaming cards. If they bother with gaming cards after that, it will be "Super" versions of their next chips, 1.5 years after the silicon goes to the data center... if that would keep NV at #1. If it won't keep Nvidia at #1, that silicon will just go to second-string accelerators... designed to fend off potential new cost competition in the AI market.

We better all cheer Intel now.
 
Rasterization is required to display anything 3d on a monitor. You would have to change display technology or else move the rasterizer if you wanted to remove that from the graphics card. And since the graphics card is the device which outputs to the display, it doesn't really make sense anywhere else.

Until we get real 3D displays (as in, not two 2d images finagled together to look 3d) or find another method of translating 3d scenes into 2d images, raster is here to stay. I'm sure someone is looking into this, but I've heard absolutely nothing about it.
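If anyone wants a concrete picture of what "translating 3D scenes into 2D images" means here, this is a bare-bones toy rasterizer in Python (purely illustrative, nothing like a real GPU pipeline): perspective-project three 3D vertices onto the screen, then fill the pixels inside the resulting 2D triangle.

```python
# Minimal "3D scene -> 2D image" demo: project a triangle's vertices, then fill it.
# Illustrative only; real rasterizers do this in hardware with far more machinery.
import numpy as np

WIDTH, HEIGHT, FOCAL = 40, 20, 15.0

def project(v):
    """Perspective-project a 3D point (x, y, z) onto the 2D image plane."""
    x, y, z = v
    return (WIDTH / 2 + FOCAL * x / z, HEIGHT / 2 - FOCAL * y / z)

def edge(a, b, p):
    """Signed-area test: which side of edge a->b does point p lie on?"""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def rasterize(tri3d):
    """Mark every pixel whose centre falls inside the projected triangle."""
    a, b, c = (project(v) for v in tri3d)
    image = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)
    for y in range(HEIGHT):
        for x in range(WIDTH):
            p = (x + 0.5, y + 0.5)
            w0, w1, w2 = edge(b, c, p), edge(c, a, p), edge(a, b, p)
            # Inside if all edge tests agree in sign (handles either winding order).
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                image[y, x] = 1
    return image

if __name__ == "__main__":
    triangle = [(-1.0, -1.0, 5.0), (1.0, -1.0, 5.0), (0.0, 1.0, 4.0)]
    for row in rasterize(triangle):
        print("".join("#" if px else "." for px in row))
```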
 
Yeah, and 2 years ago they were no longer a graphics company. They were a crypto company.

It's going to take a little more to convince me this is anything more than the latest in a long string of trends of the moment.
100%, this is all just food for press. When you get that big you always have to have people talking about something about your company.
 
Yeah, and 2 years ago they were no longer a graphics company. They were a crypto company.

It's going to take a little more to convince me this is anything more than the latest in a long string of trends of the moment.
Fair. Nothing is over till they let the GPU competition beat them. For now Nvidia still has the fastest GPU you can buy. I think more people will believe what Jensen is saying when they let AMD or Intel ship a GPU that bests them and don't respond with a Super / Titan.

Nvidia is now having to deal with even more export bans... which now affect the 4090 gaming cards as well. Frankly, being in the gaming business going forward is going to be a PITA for them anyway. If China keeps finding ways to import them through other countries, the US gov is likely to expand the export ban beyond China. Gaming GPUs are getting to be an expensive business for Nvidia to deal with. I wonder how the new bans on 4090s are going to affect the upcoming Super versions even. I wouldn't even put it past the US gov to ask/tell Nvidia behind closed doors to knock it off with the gaming business in return for a few major US gov supercomputer wins that might even include Nvidia CPU hardware. It wouldn't be hard to buy Jensen off at this point... a contract for the fastest supercomputer in the world powered 100% by Nvidia. If Jensen is offered that in return for exiting the GPU business so China can't get parts, I could see him taking the deal. If they exit the flagship space, I doubt they stick around for mid-range parts. Jensen would rather see them walk from GPUs than have someone else selling faster halo cards.
 
Watching Jensen on CNBC right now: "Chip independence." Such a funny phrase (as in "energy/oil independence"). 10-20 years, says Huang. We'll have a new boogeyman by then, of course.

He expects off-the-shelf AIs that can grind on private data (à la health records), vs. the public data that current AIs are trained on, in a few years. Only caught a bit, so no mention of gaming.

-bZj
 
I am hard roflcoptering inside. This is like a Twitter-is-now-X move.

All hail to our Nvidia AI (Skynet) overlord.
 
The original article:
https://www.newyorker.com/magazine/2023/12/04/how-jensen-huangs-nvidia-is-powering-the-ai-revolution
In the decade since Krizhevsky’s nine-page description of AlexNet’s architecture was published, it has been cited more than a hundred thousand times, making it one of the most important papers in the history of computer science. (AlexNet correctly identified photographs of a scooter, a leopard, and a container ship, among other things.) Krizhevsky pioneered a number of important programming techniques, but his key finding was that a specialized G.P.U. could train neural networks up to a hundred times faster than a general-purpose C.P.U. “To do machine learning without cuda would have just been too much trouble,” Hinton said.

Within a couple of years, every entrant in the ImageNet competition was using a neural network. By the mid-twenty-tens, neural networks trained on G.P.U.s were identifying images with ninety-six-per-cent accuracy, surpassing humans. Huang’s ten-year crusade to democratize supercomputing had succeeded. “The fact that they can solve computer vision, which is completely unstructured, leads to the question ‘What else can you teach it?’ ” Huang said to me.

The answer seemed to be: everything. Huang concluded that neural networks would revolutionize society, and that he could use cuda to corner the market on the necessary hardware. He announced that he was once again betting the company. “He sent out an e-mail on Friday evening saying everything is going to deep learning, and that we were no longer a graphics company,” Greg Estes, a vice-president at Nvidia, told me. “By Monday morning, we were an A.I. company. Literally, it was that fast.”


It sounds like that was an e-mail sent back when it would have been impressive to send it, like 2016-2017 or something.
 
Wonderful, now we can elevate and accelerate our headfirst plunge into AI and the worldwide destruction of jobs, careers, purpose, and even human life. I absolutely am 99% against AI research that isn't regulated. And there are zero regulations.

This is the one and only time I genuinely want him to fail completely. But we know he won't. Watch as these words age like fine wine. Watch and see when, in 10 to 20 years, you're unemployed and starving to death because some AI eliminated the need for you to have a job.
 
Regulation = loss of freedom. :athumbsup: 🇺🇸

More on CNBC: Huang was talking about completely replacing data centers with newly envisioned computing methods. NVidia is sticking to a hardware path to underscore the AI revolution. Good pitch for him.

-bZj
 
I absolutely am 99% against AI research that isn't regulated. And there are zero regulations.
Outside of implementing a strong world government (and even then), I am not sure how it would be possible to prevent unregulated AI research. So much of it is open source, and the tools needed are so common and cheap (a high-memory Apple M3 laptop can already do a huge amount). African nations will not necessarily have the same concerns about keeping fake jobs alive, and will develop AI models to help with their climate prediction and agricultural output, deliver nice personal medical helpers, and so on.

We can shift who does the AI research a bit (say, slow down Google, Microsoft, Apple, Facebook, Tesla, universities, and other entities under some control), but can that stop it from happening somewhere in the world over the next 100 years?
 
Outside of implementing a strong world government (and even then), I am not sure how it would be possible to prevent unregulated AI research. So much of it is open source, and the tools needed are so common and cheap (a high-memory Apple M3 laptop can already do a huge amount). African nations will not necessarily have the same concerns about keeping fake jobs alive, and will develop AI models to help with their climate prediction and agricultural output, deliver nice personal medical helpers, and so on.

We can shift who does the AI research a bit (say, slow down Google, Microsoft, Apple, Facebook, Tesla, universities, and other entities under some control), but can that stop it from happening somewhere in the world over the next 100 years?

Or you could take the Eliezer Yudkowsky approach.

Mandate tracking of all GPUs. Mandate military strikes on every location where they are amassed in large enough numbers to be used for AI training, regardless of jurisdiction, ensuring the destruction of every data center where AI training takes place.
 
Or you could take the Eliezer Yudkowsky approach.

Mandate tracking of all GPUs. Mandate military strikes on every location where they are amassed in large enough numbers to be used for AI training, regardless of jurisdiction, ensuring the destruction of every data center where AI training takes place.
Your ideas intrigue me…I would like to subscribe to your newsletter.
 
Or you could take the Eliezer Yudkowsky approach.

Mandate tracking of all GPUs. Mandate military strikes on every location where they are amassed in large enough numbers to be used for AI training, regardless of jurisdiction, ensuring the destruction of every data center where AI training takes place.
Getting all governments to work together on anything would be an amazing human accomplishment. It would really have to be classified and treated like weaponized viral research and nuclear weapons development, but at an even more severe level, to require immediate action.
 
Or you could take the Eliezer Yudkowsky approach.

Mandate tracking of all GPUs. Mandate military strikes on every location where they are amassed in large enough numbers to be used for AI training, regardless of jurisdiction, ensuring the destruction of every data center where AI training takes place.
That seems like the global new-world-order dictator approach I suggested as the only possible solution, yes. Even then, AI training and research will continue to happen on regular computers, not just in data centers.

Getting all governments to work together on anything would be an amazing human accomplishment. It would really have to be classified and treated like weaponized viral research and nuclear weapons development, but at an even more severe level, to require immediate action.
The quality and quantity of uranium needed is quite different from a regular laptop, making it a really different challenge, and soon (it already kind of has happened) the difference between ordinary compute and what is called AI today will disappear from our minds.
 
That seems like the global new-world-order dictator approach I suggested as the only possible solution, yes. Even then, AI training and research will continue to happen on regular computers, not just in data centers.


The quality and quantity of uranium needed is quite different from a regular laptop, making it a really different challenge, and soon (it already kind of has happened) the difference between ordinary compute and what is called AI today will disappear from our minds.
Fair point. And with such a low barrier to entry into AI research, lots of errors will occur and lots of places and people will be eliminated. Thus really creating the need for AI to save us from ourselves haha.
 
Your ideas intrigue me…I would like to subscribe to your newsletter.

Not my ideas. I just read his open letter in response to the letter calling for a halt in research that many AI scientists signed earlier this year. His take was that a pause as described in that letter didn't go far enough, and that we need 30 years or more of additional research into AI alignment with human priorities before we can safely push forward the way we are doing.

It was published in Time Magazine among other places. I linked it in my previous post.

So this is stuff that got coverage in relatively mainstream media. No mailing list required :p

For a few weeks there, the "oh-so-[H]" "robots are going to kill us all" news narrative got some real mainstream traction, before everyone forgot about it :p

I just think there is a lot of unfounded optimism in regards to just how "intelligent" AI can get, at least in its current form. It's mimicry based on recursive statistical models. AI becoming intelligent enough to be self-aware and kill us all seems like such quaint sci-fi nonsense.

More likely, these models will just be efficiency tools that make real human workers more efficient, until they screw up and result in a setback that negates all of that efficiency.

I'm more concerned about the outcomes when people trust it as if it really were intelligent, and it instead fails spectacularly, killing companies, and maybe even killing people in the process.

As I have said before I think there will be a correction in the market before anything else, when investors finally realize that AI will be a marginal and evolutionary change to how we work, and not an absolute game changer, and how that marginal evolution in productivity comes with some real risks of failure.

There will be a place for AI, no doubt, but there is WAY too much evangelizing about it out there right now for me to take it seriously, because whenever someone evangelizes anything, they are probably wrong. At least I haven't seen a case where they weren't yet.
 
anything more than the latest in a long string of trends of the moment.
I have a feeling that this isn't a trend, unless there's a good shift in the software space away from CUDA towards, well, anything else. (Gaudi has been hitting it - watch it.)
 
“Jensen sent out an email on Friday evening saying everything is going to deep learning, and that we were no longer a graphics company. By Monday morning, we were an AI company."

It’s no secret that Nvidia has quickly morphed into an AI company. Although it creates some of the best graphics cards for PC gamers, the company’s supercomputing efforts have catapulted it into being a trillion-dollar company, and that transformation was spurred on by the monumental rise of ChatGPT. That shift, from a graphics company to an AI company, was an intentional choice by Nvidia’s CEO Jensen Huang.

In a moment of saying the quiet part out loud, Greg Estes, the vice president of corporate marketing at Nvidia, said: “[Jensen] sent out an email on Friday evening saying everything is going to deep learning, and that we were no longer a graphics company. By Monday morning, we were an AI company. Literally, it was that fast.”

https://www.digitaltrends.com/computing/nvidia-said-no-longer-graphics-company/
https://www.newyorker.com/magazine/2023/12/04/how-jensen-huangs-nvidia-is-powering-the-ai-revolution

Ok
 
Folks, this doesn't mean GPUs will suddenly be neglected. I see this more as NVIDIA acknowledging that its business is evolving, and it can't just stay the course.

A good parallel is Apple. The company's business initially revolved around the Mac, but it was happy to pivot to the iPod, to the iPhone, to... you get the idea. That didn't mean Apple forgot about the Mac (although there was that 'dark time' of bad choices in 2013-2018), but it didn't insist on prioritizing that business section at all costs. NVIDIA can focus on courting AI customers while hopefully still making good GPUs.

If you want an example of the danger of fixating on a certain business segment, just look at Microsoft under Ballmer. He was so obsessed with protecting the Windows PC business that he either ignored the changing landscape or undermined efforts to change. The Windows Mobile/Phone team was perpetually starved of resources, and had to always play second fiddle to the PC. The Xbox lost its focus on gaming as Ballmer tried to make it a Trojan horse for Windows experiences in the living room. By the time he recognized what was going on, it was too late; Microsoft went from absolute dominance of computing to just another big name alongside Apple and Google.
 
Folks, this doesn't mean GPUs will suddenly be neglected. I see this more as NVIDIA acknowledging that its business is evolving, and it can't just stay the course.

A good parallel is Apple. The company's business initially revolved around the Mac, but it was happy to pivot to the iPod, to the iPhone, to... you get the idea. That didn't mean Apple forgot about the Mac (although there was that 'dark time' of bad choices in 2013-2018), but it didn't insist on prioritizing that business section at all costs. NVIDIA can focus on courting AI customers while hopefully still making good GPUs.

If you want an example of the danger of fixating on a certain business segment, just look at Microsoft under Ballmer. He was so obsessed with protecting the Windows PC business that he either ignored the changing landscape or undermined efforts to change. The Windows Mobile/Phone team was perpetually starved of resources, and had to always play second fiddle to the PC. The Xbox lost its focus on gaming as Ballmer tried to make it a Trojan horse for Windows experiences in the living room. By the time he recognized what was going on, it was too late; Microsoft went from absolute dominance of computing to just another big name alongside Apple and Google.
And let’s face it, Nvidia faces serious competition on the AI front from a number of other companies who are designing bespoke hardware for AI acceleration.

Nvidia is not at all facing that level of competition from AMD or Intel on the consumer or workstation graphics front.
Intel is doing well at the bottom, AMD is offering some resistance in the middle, but Nvidia just needs to do a slight price correction which is 100% within their capabilities and they continue to dominate the consumer GPU space top to bottom with at best a paper thin resistance.
 
And let’s face it, Nvidia faces serious competition on the AI front from a number of other companies who are designing bespoke hardware for AI acceleration.

Nvidia is not at all facing that level of competition from AMD or Intel on the consumer or workstation graphics front.
Intel is doing well at the bottom, AMD is offering some resistance in the middle, but Nvidia just needs to do a slight price correction which is 100% within their capabilities and they continue to dominate the consumer GPU space top to bottom with at best a paper thin resistance.
The problem of course being that Nvidia doesn't have enough fab space.
The 4090 is a problem for them currently. They would rather not make it.
The few they do make are what has accounted for their uptick in sales the last two quarters... as Chinese customers have been buying all they could get before last month's import ban on 4090s. Now they will have to use third parties to hoover up as many as they can.

I think the question going forward is: does it make sense for Nvidia to be a mid-range GPU provider? I'll call it now: Blackwell gaming cards are NOT coming. Nvidia will ship 4090 Super parts, sure... but they are done using their top-end fab space for GPUs imo. If they stay in the business of GPUs, the next GPU after the 4090 Super would be a Blackwell refresh on what will by then be a less premium fab process. However, that only happens if Intel and AMD aren't shipping anything that would make it look bad. If AMD or Intel is shipping a new GPU that would be actual competition for a Blackwell refresh, I think it's pretty realistic that Jensen just says we are done bitches, we retire king.
 
Rasterization is required to display anything 3d on a monitor. You would have to change display technology or else move the rasterizer if you wanted to remove that from the graphics card. And since the graphics card is the device which outputs to the display, it doesn't really make sense anywhere else.

Until we get real 3D displays (as in, not two 2d images finagled together to look 3d) or find another method of translating 3d scenes into 2d images, raster is here to stay. I'm sure someone is looking into this, but I've heard absolutely nothing about it.

And let’s face it, Nvidia faces serious competition on the AI front from a number of other companies who are designing bespoke hardware for AI acceleration.
We'll see. There are lots of companies claiming they are going to have amazing neural net accelerators, and who knows, maybe they will. Maybe they'll completely trounce nVidia and nVidia will be an "AI has-been" just that fast... Or maybe they won't. It's easy to talk about how your new, as-yet-unreleased product is going to be the most amazing thing in the history of ever; it is harder to deliver. I can't even count all the CPU and GPU claims throughout history that have not lived up to their hype or simply not panned out.

So as with most things: I'll believe it when I see it. When someone produces the "nVidia killer" and it is in the hands of companies, being used on actual workloads, and demonstrating amazingly superior performance then I'll believe it. Until then, I remain skeptical.

Also there's the little issue that nVidia keeps improving their stuff as well.
 
Nvidia doesn't have enough fab space.
With all the fab projects launched around the world since the 2020 supply issues, it would be a bit short-sighted to assume fab access will stay an issue; the same goes for "we cannot make enough H100s." That could be a situation that continues for months, not years.

Most gaming cards being on an older node rather than the latest will simply continue to be the norm (you cannot buy a gaming GPU on N3 outside Apple right now); by 2025, when gaming GPUs launch on it, N3 will be two years old.
 
When someone produces the "nVidia killer" and it is in the hands of companies, being used on actual workloads, and demonstrating amazingly superior performance then I'll believe it. Until then, I remain skeptical.

Do a Google search for "ATI 9700 Pro reviews". Then sit back and prepare to be astonished that someone, unbelievably, once produced an "Nvidia killer".
 
The problem of course being that Nvidia doesn't have enough fab space.
The 4090 is a problem for them currently. They would rather not make it.
The few they do make are what has accounted for their uptick in sales the last two quarters... as Chinese customers have been buying all they could get before last month's import ban on 4090s. Now they will have to use third parties to hoover up as many as they can.

I think the question going forward is: does it make sense for Nvidia to be a mid-range GPU provider? I'll call it now: Blackwell gaming cards are NOT coming. Nvidia will ship 4090 Super parts, sure... but they are done using their top-end fab space for GPUs imo. If they stay in the business of GPUs, the next GPU after the 4090 Super would be a Blackwell refresh on what will by then be a less premium fab process. However, that only happens if Intel and AMD aren't shipping anything that would make it look bad. If AMD or Intel is shipping a new GPU that would be actual competition for a Blackwell refresh, I think it's pretty realistic that Jensen just says we are done bitches, we retire king.
Well, nobody has enough fab space. TSMC is making promises, but their delivery dates for access are in the 2025+ range.
Does it make sense for the consumer Blackwell GPUs to be on Samsung, with the enterprise stuff on TSMC 3?
TSMC 3 is behind schedule in just about all aspects.
 
Not all those fabs are going to be bleeding edge enough to handle the requirements of Apple, AMD and nVidia though. So it's going to be interesting.
 
With all the fab projects launched around the world since the 2020 supply issues, it would be a bit short-sighted to assume fab access will stay an issue; the same goes for "we cannot make enough H100s." That could be a situation that continues for months, not years.

Most gaming cards being on an older node rather than the latest will simply continue to be the norm (you cannot buy a gaming GPU on N3 outside Apple right now); by 2025, when gaming GPUs launch on it, N3 will be two years old.
Indeed. This is my point. People getting all excited about a 5090 should chill out now (any 5090 we get won't be based on this upcoming arch). Blackwell is not coming to consumer cards until it is refreshed, if it comes at all. There are lots of reports and leaks regarding 512-bit-bus Blackwell chips with 192 SMs being sold as 5090s and the like... those chips exist, they just aren't coming to any consumer-level GPU. They are slated to be ready late 2024... but I highly doubt they get a gaming launch. There is no way Jensen is going to dedicate even cast-off parts to the consumer market (as China has proven, even 4080-level cast-offs will get plucked out of consumer cards to be re-seated). That pushes a consumer launch of a "5090"-type card, assuming Nvidia does release one, out to Blackwell-refresh territory, which is more like end of 2025, maybe even 2026. If AMD and Intel don't push past a 4090 Super refresh, they might not even release a gaming card on that either. If Nvidia doesn't just abandon gaming... if AMD and Intel can't solidly dethrone a 4090 Super, they might pull a 5000-series release that is something like 5060/5070-only cards based on a very cut-down Blackwell die with more DLSS hype to compensate.
 
This is the same as saying Procter & Gamble Company (P&G) is no longer a laundry detergent company.
 
Well, nobody has enough fab space. TSMC is making promises, but their delivery dates for access are in the 2025+ range.
Does it make sense for the consumer Blackwell GPUs to be on Samsung, with the enterprise stuff on TSMC 3?
TSMC 3 is behind schedule in just about all aspects.
To be honest, no, it doesn't make sense. Not to me... and I know it makes zero sense to investors right now.
If Nvidia were able to get working Blackwell chips out of Samsung, they would be expected to get them into higher-margin products as well.
The only consumer-level chips that make sense for Nvidia going forward are very cut-down versions that would have lower value in the AI market (assuming Nvidia could still use their current GeForce pricing scheme).
Essentially anything that gets close to even 4080-level performance at this point is going to be subject to export bans... and if Nvidia does produce them, China will have cut-out purchasers snapping them up.

If Nvidia stays around in the consumer space, their cards are going to get more expensive for less impressive products. Which imo is exactly why Jensen is preparing the company to walk from consumer GPUs. It really makes no sense right now unless you really believe AI is going to bust. Which Jensen does not.
 
If Nvidia were able to get working Blackwell chips out of Samsung, they would be expected to get them into higher-margin products as well.
The Ampere generation had the high-end A100 on TSMC 7 and gaming cards on Samsung 8. If you can sell the high end at $35,000 instead of $30,000 because of the better node, the die-price difference becomes quite negligible there, while it still matters for $300 video cards, let alone opening up higher-volume options.
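Back-of-the-envelope version of that argument, with completely made-up die and build costs just to show the shape of it:

```python
# Why an expensive leading-edge die barely dents accelerator margins but wrecks
# cheap gaming-card margins. All numbers are invented for illustration only.
def margin(price, die_cost, other_costs):
    return (price - die_cost - other_costs) / price

leading_die, trailing_die = 250.0, 120.0   # hypothetical per-die costs

# On a ~$30k accelerator the $130 die delta barely moves the needle...
print(f"accelerator, leading node:  {margin(30_000, leading_die, 3_000):.1%}")
print(f"accelerator, trailing node: {margin(30_000, trailing_die, 3_000):.1%}")

# ...while on a $300 gaming card it is a huge slice of the sale price.
print(f"gaming card, leading node:  {margin(300, leading_die, 100):.1%}")
print(f"gaming card, trailing node: {margin(300, trailing_die, 100):.1%}")
```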
 
The Ampere generation had the high-end A100 on TSMC 7 and gaming cards on Samsung 8. If you can sell the high end at $35,000 instead of $30,000 because of the better node, the die-price difference becomes quite negligible there, while it still matters for $300 video cards, let alone opening up higher-volume options.
That is the old math. Sure, that used to be the case. Now Nvidia is charging $30k for their top-end silicon. Their L40, which is 4090 silicon (ok, the 4090 has I think something like 5-10% of the chip disabled) with doubled RAM chips, is selling for 10x a 4090. To make it even sweeter for Nvidia, there are essentially no middlemen on that deal. It's pretty hard to justify putting top-end silicon like the AD102 into consumer gaming cards selling for even $2k when you can sell a data-center version, with no middleman hardware vendors attached, for $10-14k a pop.

If Nvidia were to get Samsung to spit out Blackwell-based chips next year, or even the year after... I don't see how they could convince investors that they don't have a responsibility to be putting them in data-center cards. Savvy investors are aware, and already a little annoyed, that 4090 and even 4080 stock is getting whisked away to be broken down and re-seated into data-center product (it is not just China doing it). Really, if they could get decent parts out of Samsung, it might actually solve some issues for Nvidia, giving them a good source of chips to sell as second-tier AI acceleration to emerging markets (and maybe even China officially... that isn't likely, but it's more of a possibility than Nvidia being allowed to sell them the real stuff).
 