Cramer Hails Nvidia's 'Predictably Fabulous' Jensen Huang as a Visionary CEO

erek

Impressive, and it's not the Cosmo Kramer from Seinfeld

"“If you have enough Nvidia cards put together, you can enable all of this incredible artificial intelligence stuff that everybody's so excited about now. It's much more efficient than using CPUs — think what Intel makes,” the “Mad Money” host reportedly said.

Cramer said that Nvidia knew the potential of generative AI models long ago and “had the cards ready for all who wanted them when Wall Street finally came around to generative artificial intelligence.”

Nvidia ended Friday’s session at $389.46, up 2.54%, according to Benzinga Pro data."

Source: https://www.benzinga.com/markets/pe...iscalnotes-nyse-note-frontier-view-helps-navi
 
Call "AI" what you will but in the business world Nvidia is the only company currently able to run the models and those models are capable of processing absurd amounts of data, organizing it, and reporting back correlations and patterns that it would take an unwieldy amount of man hours to accomplish, to the point where it just wasn't feasible.
 
Call "AI" what you will but in the business world Nvidia is the only company currently able to run the models and those models are capable of processing absurd amounts of data, organizing it, and reporting back correlations and patterns that it would take an unwieldy amount of man hours to accomplish, to the point where it just wasn't feasible.
Oh, I agree.
 
Call "AI" what you will but in the business world Nvidia is the only company currently able to run the models and those models are capable of processing absurd amounts of data, organizing it, and reporting back correlations and patterns that it would take an unwieldy amount of man hours to accomplish, to the point where it just wasn't feasible.
AMD cards can run these models too. I've run Stable Diffusion, YOLO object detection, and numerous chatbots on my 6700 XT and they all ran great.
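
For what it's worth, none of that needs AMD-specific code. Here's a minimal sketch of what "running it on the 6700 XT" looks like, assuming a ROCm build of PyTorch (which reuses the "cuda" device name for AMD GPUs); the shapes are just illustrative:

import torch

# ROCm builds of PyTorch expose AMD GPUs through the regular torch.cuda API.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print("Running on:", torch.cuda.get_device_name(0) if device.type == "cuda" else "CPU")

# Toy stand-in for a model forward pass, just to confirm the GPU is doing the math.
x = torch.randn(1, 3, 640, 640, device=device)           # a YOLO-sized input tensor
conv = torch.nn.Conv2d(3, 32, kernel_size=3).to(device)
with torch.no_grad():
    y = conv(x)
print(y.shape)                                           # torch.Size([1, 32, 638, 638])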
 
You're the one who mentioned consumer in the first place? :confused:
And? What's your point? How does the fact that consumer cards can run them take away from the fact that AMD also sells enterprise cards?

Were you not aware that the Instinct line exists? I assumed hardware enthusiasts would be aware.

My point is that the statement "in the business world Nvidia is the only company currently able to run the models" is completely false.
 
And? What's your point? How does the fact that consumer cards can run them take away from the fact that AMD also sells enterprise cards?

My point is that the statement "in the business world Nvidia is the only company currently able to run the models" is completely false.
Run the models feasibly at scale and cost-effectively... It's implied already by Lakados' post. The Nvidia solutions are MUCH faster.
 
I've already done the research, and I'm willing to bet I'm more experienced in this space than you are.
The MI200 has slower performance in FP16 and int8 workloads but offers much more memory than the A100. Also, the MI200 completely destroys Nvidia in FP64 workloads, although this is more relevant for HPC than AI.
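
If you want to see why that precision mix matters, here's a rough sketch of how one might measure matmul throughput per dtype on whatever GPU is present. It's purely illustrative, not a benchmark of any particular card:

import time
import torch

def tflops(dtype, n=4096, iters=20):
    # Time n x n matmuls and convert to TFLOPS (one matmul is ~2*n^3 FLOPs).
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    torch.cuda.synchronize()
    start = time.time()
    for _ in range(iters):
        a @ b
    torch.cuda.synchronize()
    return (2 * n ** 3 * iters) / (time.time() - start) / 1e12

for dt in (torch.float16, torch.float32, torch.float64):
    print(dt, f"{tflops(dt):.1f} TFLOPS")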
The point is that AMD is absolutely competitive here. But of course facts don't matter to certain people, and I'm wasting my time right now.
The lesson here is that if I want to get Cramer-esque takes on computer hardware, this forum is the place to be. Personally I'd rather spend my time actually working on this stuff than on pointless, ignorant debates.
 
I've already done the research, and I'm willing to bet I'm more experienced in this space than you are.
The MI200 has slower performance in FP16 and int8 workloads but offers much more memory than the A100.

The point is that AMD is absolutely competitive here. But of course facts don't matter to certain people, and I'm wasting my time right now.
The lesson here is that if I want to get Cramer-esque takes on computer hardware, this forum is the place to be. Personally I'd rather spend my time actually working on this stuff than on pointless, ignorant debates.
Thanks for clarifying that you agree with me. Memory capacity is irrelevant; either the model fits or it doesn't. The workloads you say the AMD solutions are slower at are exactly the ones used for AI inference. Keep the personal insults and chest-beating to yourself, please.
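
For anyone following along, here's a back-of-the-envelope sketch of the "fits or it doesn't" point. The parameter count and precisions are purely illustrative, and real runs also need headroom for activations and the KV cache:

# Weights-only footprint = parameter count x bytes per parameter.
params = 7e9                      # e.g. a 7B-parameter model (illustrative)
for name, bytes_per in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
    print(name, f"{params * bytes_per / 2**30:.1f} GiB")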
 
You're really splitting hairs here. Both AMD and Nvidia have consumer and professional/enterprise GPUs that can run these models.
AMD really doesn't. The lack of Int4 and Int8 on the Instinct hardware leaves them between 1/30th and 1/100th of the speed of Nvidia's current hardware set, which is why AMD is talking about how the MI300 series will be their first cards capable of running modern AI models. ChatGPT does not even work on AMD hardware; there are projects like Vicuna for ROCm that rework the models for FP16, but the supported hardware set there is very limited, and performance is still meh.
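
To make the FP16-vs-int8 point concrete, here's a hedged sketch of the two loading paths being discussed. The checkpoint name is only an example; FP16 works anywhere PyTorch sees a GPU (ROCm builds included), while the 8-bit path leans on bitsandbytes' kernels:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "lmsys/vicuna-7b-v1.5"   # example checkpoint, purely illustrative

tok = AutoTokenizer.from_pretrained(name)

# FP16: half-precision weights, runs on any GPU PyTorch can see.
model_fp16 = AutoModelForCausalLM.from_pretrained(
    name, torch_dtype=torch.float16, device_map="auto"
)

# Int8: roughly halves memory again, but relies on bitsandbytes' int8 kernels.
model_int8 = AutoModelForCausalLM.from_pretrained(
    name, load_in_8bit=True, device_map="auto"
)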
 
Cramer shills for companies right before they collapse into heaps of dung and ruins

Good luck, nV; it's not looking good for you when the guy who knows nothing shills for you -- Cramer
 
Funny, that's exactly how I feel. Have a good day. I suggest you put Lakados and me on ignore so you don't have to suffer from corrections, if you think either of us has no idea what we're talking about.
I read that as "you should put us on ignore so you can't correct us on our Nvidia shilling"

Don't worry guys, I don't post here much anyway.
 
And the facts are that there are plenty of institutions purchasing Instinct cards for mixed workloads. AMD cards are now competitive in this space. If you can't handle that sort of statement, then feel free to ignore me.

Personally I don't play favorites and think competition is actually a good thing - what a concept, right?
 
Stop with the trolling, please. Address the topic with pertinent arguments. Name-calling and trolling each other personally will get you banned.
 
There was an econ prof who had a mock account that invested in Cramer's picks; it lost 64% in a year or so, if I remember correctly. How this asshole still has a job is beyond my understanding. Wait, it's 2023. Also, there's an actual inverse-Cramer ETF now, so...

https://www.marketwatch.com/investing/fund/sjim
 
 
If Jim Cramer says something is amazing, sell promptly. The stooge has an amazing track record of being completely wrong. If you just did exactly the opposite of what he said, you'd be rich.

Besides that - why would anyone successful on the market be wasting their time on TV?
 
Besides that - why would anyone successful on the market be wasting their time on TV?
If you think about it, why did Nvidia sell GPUs instead of just hoarding them all and mining with them to make bank? Kinda makes you wonder.
 
What will be interesting to me is what the market eventually decides it’s ultimately worth. The P/E ratio and all of those classic metrics suggest the valuation is high, but if I’ve learned anything from watching stocks like Tesla, it’s that none of that matters once the animal spirits take over, and it becomes almost impossible to predict pricing based on traditional metrics. What’s high when everyone has decided that the sky is the limit? 2023 will be an interesting year for stocks in this space, and by the looks of it, all eyes are on Nvidia to lead the way.

One thing that isn't up for debate is that Nvidia has shown itself to be the real deal when it comes to computing. Compare their technological evolution to a company like Intel, which appears to have sat in the corner for a solid decade eating glue while the world changed around them. Huang is up there as one of the best CEOs in tech, no doubt about it.

As always, personal opinion and observation, not an endorsement, and not investment advice.
 


Wrong about that crash, and wrong about the current inflation crash. Wrong about everything.

Even if he might be correct in this instance, my reflex is to do the exact opposite of what he says. He's a paid shill / idiot. Why is he wasting his time on television if he's so amazing at playing the market?
 


Wrong about that crash, and wrong about the current inflation crash. Wrong about everything.

Even if he might be correct in this instance, my reflex is to do the exact opposite of what he says. He's a paid shill / idiot. Why is he wasting his time on television if he's so amazing at playing the market?

Reminds me of Michael Pachter.
 
One thing that isn't up for debate is that Nvidia has shown itself to be the real deal when it comes to computing.
In what way? Before the AI craze, they were riding high on crypto because graphics cards crunch math really well. It's not like Nvidia is the only one capable of doing AI either, just that they do it very well. I'm still not even sure what applications people need these AI accelerators for, and that's the elephant in the room people are ignoring.
Compare their technological evolution to a company like Intel, which appears to have sat in the corner for a solid decade eating glue while the world changed around them.
Yeah, but Intel actually makes a lot of products companies need, including Nvidia. Nvidia makes graphics cards or AI accelerators that need to go into a computer, and that computer will likely be an Intel one.
Huang is up there as one of the best CEOs in tech, no doubt about it.
From my perspective it seems Huang is creating the biggest pump-and-dump stock scheme of the decade. Their stock keeps going up with nothing to show but empty promises of AI. Typically in these situations the shareholders with the biggest stakes leave, and take the majority of the stock's value with them. That's what is going to happen with Nvidia's stock.
https://www.reddit.com/r/wallstreet.../prepare_to_get_aied_in_your_ai_bears_nvda_1/
 
Their stock keeps going up with nothing to show
Nothing to show? Nvidia's revenue for the quarter ending around April 30:

April 30, 2017: 1.937B
April 30, 2018: 3.207B
April 30, 2019: 2.22B
April 30, 2020: 3.08B

If the next quarter does $11B, that's roughly 11 / 1.937 ≈ 5.7x the same quarter in 2017, close to 500% growth in 6 years...

I'm still not even sure what applications people need these AI accelerators for,
You can easily come up with a lot of them: you can easily see why speech-to-text could be nice, or object detection in pictures and other vision tasks; you can come up with a giant list of machine learning applications.
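
As a tiny, hedged illustration of one of those: off-the-shelf object detection with a pretrained torchvision model. The image path is a placeholder:

import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()

img = read_image("photo.jpg")            # placeholder input image
batch = [weights.transforms()(img)]
with torch.no_grad():
    detections = model(batch)[0]

# Print whatever the model is reasonably confident about.
for label, score in zip(detections["labels"], detections["scores"]):
    if score > 0.8:
        print(weights.meta["categories"][label], f"{score:.2f}")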

You can google "machine learning in X" and find a bit of what is going on in that field: medicine, pharma, petroleum, synthetic proteins, agriculture:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8356896/

Machine Learning in Drug Discovery: A Review​


https://www.sciencedirect.com/science/article/pii/S2667318521000106

Machine learning in agriculture domain: A state-of-art survey​


https://www.sciencedirect.com/science/article/pii/S2096249521000429

Application of machine learning and artificial intelligence in oil and gas industry​


In what field is ML not already massively used, or not going to be in the 2020s?
 
As someone who actually runs and maintains Nvidia enterprise hardware (DGX-1 x2, DGX-2, DGX A100), I can say the AMD solutions are nowhere near as fast or cost-effective. The ecosystem for Nvidia has grown so much since 2017 and has made the technology much easier to access, set up, and get going than when I first started working on this stuff.
 