Nvidia pulls in some $18.1B, up from $5.93B this time last year...

It's all virtual wealth. It isn't actually backed by a deliverable asset. It's GameStop-type hype value, nothing more. Temporary wealth is not real wealth; it will vaporize. Nvidia is volatile and should be regarded as get rich quick, or lose a lot if you aren't careful.
Nvidia made $9-10 billion of net income in 3 months selling physical tools and toys... calling this all virtual wealth not backed by a deliverable asset is a bit strange.

Nvidia's potential future giant success and great future management are already pretty much baked into its current price. I would not advise anyone to see it as a possible get-rich-quick vehicle; it will not go from $1.2 trillion to a large multiple of $1.2 trillion quickly. It is a really long play if you get in now.
 
Nvidia made $9-10 billion of net income in 3 months selling physical tools and toys... calling this all virtual wealth not backed by a deliverable asset is a bit strange.

Nvidia's potential future giant success and great future management are already pretty much baked into its current price. I would not advise anyone to see it as a possible get-rich-quick vehicle; it will not go from $1.2 trillion to a large multiple of $1.2 trillion quickly. It is a really long play if you get in now.
Just watch as it plays out. All I'm gonna say.
 
It's all virtual wealth. It isn't actually backed by a deliverable asset. It's GameStop-type hype value, nothing more. Temporary wealth is not real wealth; it will vaporize. Nvidia is volatile and should be regarded as get rich quick, or lose a lot if you aren't careful.

GameStop surged because some kids on Reddit wanted to crush a hedge fund for the lulz. That business model is dying and everyone knows it. The people buying the stock knew it; they just saw a massive short position and knew the shorts would need to cover if enough people managed to move it higher. Nvidia is not GameStop. Nvidia is a company that saw a trend coming years ago and positioned itself to take advantage. They're not only dominating in the hardware space, they're dominating on the API side as well. Their greatest risk is companies like Microsoft and Google using their resources to develop more in-house tools, but as of now they're firmly in the driver's seat and have the best hardware and API for the foreseeable future, in an industry that's growing for a good business reason.
 
All fiat, all printed, all gonna tank when BRICS comes online. Better be buying hard assets now. Au, Ag, Pt, etc.
 
*snip*

As far as I am concerned, large-scale use of AI is just one Pandora's box that is better left unopened. It won't kill us all directly. It will never become that competent. But indirectly, through misplaced trust and social upheaval? It very well may kill us all.
I do agree with most of your post I snipped in my quote, but the portion I quoted is my true fear.

Blind trust in AI's output / the conclusions it draws. The leap that occurs when there is no trained human involved in the decision-making process before action is taken. Keyword being trained/experienced (maybe that's the better word).
 
So, I might as well ask. Why do you care about auto-HDR so much? I've long used HDR displays but would never consider auto-HDR a viable feature in and of itself, let alone a reason to upgrade an OS. Is it simply because you know you have HDR displays and that Windows 11 is "activating" that feature for you? Or do you genuinely believe that auto-HDR is better than SDR?
About 1-2 years. Then it'll burst. Then we'll figure out what it's really useful for and build that.
 
I do agree with most of your post I snipped in my quote, but the portion I quoted is my true fear.

Blind trust in AI's output / the conclusions it draws. The leap that occurs when there is no trained human involved in the decision-making process before action is taken. Keyword being trained/experienced (maybe that's the better word).
Then you won't like this:

https://www.businessinsider.com/us-...de-kill-humans-artifical-intelligence-2023-11

 
I mean, it's not exactly new; Samsung launched the SGR-A1 platform back in 2007.
I have to assume it's only gotten better in the last 16 years.
There always exists the option to equip them with a non-lethal loadout and let them make a decision.
AI making the decision to kill humans is what The Creator is basically about, by the way.
 
As long as they go Farnsworth, I don't object.

Wernstroms don't hold a candle; I'm forced to side with the housewives on this one.

Sorry, first thing that came to mind on the topic.

Jokes aside... like all AI, it sounds great on paper and the technology is touted as flawless... until it is not. And in this case, the stakes could not be higher.

Accountability usually seems to be an afterthought when these types of things are implemented. Who's liable when this inevitably kills a noncombatant?

There's just too much faith in this technology (AI of any form) for me to feel remotely comfortable about relying on it.
 
Too many of these companies right now are playing with fanciful research projects without a real goal in mind. AI for the sake of AI, for some vague promised "future". This is much like how in the '90s many companies had websites and bragged to their shareholders about their websites, but at the same time couldn't really explain why they had websites and what value they added. Or how the Long Island Iced Tea company saw its share price skyrocket after doing nothing but renaming itself Long Blockchain Corp.
The goal is models that give a majority of people a 30% boost in productivity without dimming the lights of the whole planet.

However, how we finally get there might leave nVidia looking like SGI, which would be ironic. Or Google looking like Yahoo.

And the VCs sure as shit can't tell one AI researcher apart from another, so it's random chaos. But someone is going to win huge. You don't need AGI, just really good cross-industry classifiers that run on next decade's GPUs.
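
To make "cross-industry classifier" concrete, here is a minimal sketch of the same pipeline trained on two different, entirely made-up industry datasets. The point is that nothing but the labelled examples changes from one industry to the next; this is an illustration, not anyone's actual stack.

```python
# Minimal sketch of the "same classifier, different industry" idea.
# All data below is made up; only the labelled examples differ per industry.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def train_classifier(texts, labels):
    """Generic text classifier: TF-IDF features + logistic regression."""
    model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2)),
        LogisticRegression(max_iter=1000),
    )
    model.fit(texts, labels)
    return model

# "Industry A": support tickets -> routing queue (toy data)
tickets = ["card was charged twice", "cannot log into my account",
           "app crashes on startup", "refund not received"]
queues = ["billing", "auth", "bug", "billing"]
support_model = train_classifier(tickets, queues)

# "Industry B": insurance claims -> triage priority (toy data)
claims = ["minor windshield chip", "total loss after collision",
          "water damage in basement", "scratched bumper"]
priority = ["low", "high", "high", "low"]
claims_model = train_classifier(claims, priority)

print(support_model.predict(["charged me two times this month"]))
print(claims_model.predict(["house flooded overnight"]))
```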
 
The goal is models that give a majority of people a 30% boost in productivity without dimming the lights of the whole planet.

However, how we finally get there might leave nVidia looking like SGI, which would be ironic. Or Google looking like Yahoo.

And the VCs sure as shit can't tell one AI researcher apart from another, so it's random chaos. But someone is going to win huge. You don't need AGI, just really good cross-industry classifiers that run on next decade's GPUs.
Or just do the brain-dead repetitive tasks that are so boring you can't pay a human to do them: they get so bored they quit to maintain their sanity, or they are so tuned out of the job, latching onto anything to keep their brain engaged, that they are completely unproductive. They either get nothing done, or what they do get done is laced with so many errors it's unusable. Those are perfect jobs for AI. I remember getting paid damned near $25 an hour in the '90s transcribing medical data, just trying to read badly smudged, stained, incomplete, misspelled forms and transcribe them into digital form. No music, no phones, no internet, and no coffee, just a Windows 3.x machine with an overly bright CRT and endless bankers boxes with the sort of filing you would expect from a toddler. Most quit in a week or two; I managed to stick it out for the two months because I had tuition to pay. It was hell.
 
Or just do the brain-dead repetitive tasks that are so boring you can't pay a human to do them: they get so bored they quit to maintain their sanity, or they are so tuned out of the job, latching onto anything to keep their brain engaged, that they are completely unproductive. They either get nothing done, or what they do get done is laced with so many errors it's unusable. Those are perfect jobs for AI. I remember getting paid damned near $25 an hour in the '90s transcribing medical data, just trying to read badly smudged, stained, incomplete, misspelled forms and transcribe them into digital form. No music, no phones, no internet, and no coffee, just a Windows 3.x machine with an overly bright CRT and endless bankers boxes with the sort of filing you would expect from a toddler. Most quit in a week or two; I managed to stick it out for the two months because I had tuition to pay. It was hell.
There are also a lot of industries requiring specialization that, for whatever reason, people are refusing to go into.

One that I feel like not enough people are aware of is court reporting. It's a career that basically doesn't even require a high-school education, takes 1-2 years of specialized training to get into, and starts at over $100k a year with full benefits, at least in the State of California. It is so in demand that if you can pass the associated certification tests, you will basically have recruiters calling you to give you work. And even if you don't want to work inside the public court system, you could work private instead, doing depositions for lawyers. I don't know of any other career path that gets to that level of guaranteed work more than court reporting at this point.

But whether it's about being too unknown, too uninteresting, too difficult, too scary, or some combination thereof, people aren't filling those jobs. And it's a job that is basically necessary for our legal system to work.

I personally don't think it's worth getting into because I think the courts will be forced to adopt some form of automated transcribing. Once it's possible for an AI to accurately identify multiple different voices at the same time and assign them to the correct people in the transcript and get words (especially legal terms) with 99.99% accuracy (which at that point would likely be more accurate than a human), I don't think there is a way the courts can say no. Even with how slow the court system is to change, especially technological change.
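
For what it's worth, the building blocks for that kind of transcript already exist in rough form. Here is a minimal sketch, assuming the openai-whisper and pyannote.audio packages; the model names, the made-up audio file, and the naive overlap matching are illustrative only and nowhere near court-grade.

```python
# Minimal sketch: speech-to-text plus speaker labels ("who said what").
# Assumes openai-whisper and pyannote.audio are installed; pyannote models
# require a (free) Hugging Face access token. Nothing here is court-grade.
import whisper
from pyannote.audio import Pipeline

AUDIO = "hearing.wav"  # hypothetical recording

# 1. Transcription with segment timestamps.
asr = whisper.load_model("medium")
result = asr.transcribe(AUDIO)

# 2. Diarization: who spoke during which time spans.
diarizer = Pipeline.from_pretrained(
    "pyannote/speaker-diarization-3.1",
    use_auth_token="hf_...",  # placeholder token
)
turns = diarizer(AUDIO)

# 3. Attach each transcribed segment to the speaker whose turn overlaps it most.
def speaker_for(start, end):
    best, best_overlap = "UNKNOWN", 0.0
    for turn, _, speaker in turns.itertracks(yield_label=True):
        overlap = min(end, turn.end) - max(start, turn.start)
        if overlap > best_overlap:
            best, best_overlap = speaker, overlap
    return best

for seg in result["segments"]:
    print(f'{speaker_for(seg["start"], seg["end"])}: {seg["text"].strip()}')
```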

EDIT: (The only reason they would not is the live aspect of it. There is a certain paranoia inside the court system that I actually think is a good thing. They want a human doing the transcribing in the room, especially as it becomes more possible in the future to alter video/audio recordings, and because the kinds of errors an AI could make, or be intentionally made to make, could result in a lot of abuses and injustices. Theoretically, a human transcriber is supposed to be another check and balance, a totally neutral party in the courtroom. Still, it's debatable whether a court reporter who starts now will have the ability to retire from it in 40 years.)
 
Or just do the brain-dead repetitive tasks that are so boring you can't pay a human to do them: they get so bored they quit to maintain their sanity, or they are so tuned out of the job, latching onto anything to keep their brain engaged, that they are completely unproductive. They either get nothing done, or what they do get done is laced with so many errors it's unusable. Those are perfect jobs for AI. I remember getting paid damned near $25 an hour in the '90s transcribing medical data, just trying to read badly smudged, stained, incomplete, misspelled forms and transcribe them into digital form. No music, no phones, no internet, and no coffee, just a Windows 3.x machine with an overly bright CRT and endless bankers boxes with the sort of filing you would expect from a toddler. Most quit in a week or two; I managed to stick it out for the two months because I had tuition to pay. It was hell.
All complex tasks are just a bunch of brain-dead repetitive tasks. The idea right now is to try and replace people in high-paying jobs with AI, because ultimately that's what they do.
 
All complex tasks are just a bunch of brain-dead repetitive tasks. The idea right now is to try and replace people in high-paying jobs with AI, because ultimately that's what they do.
Well, all I can say about my time doing the medical transcribing is that the best people working there were the crackheads. They were on time, worked hard, and weren't the worst hygiene offenders; some of the neckbeards stank to high hell.
The place asked very few questions about the people working there; having bodies doing the work was more important than their personal lives.
But all we got for training was a 45-minute to 1-hour show-and-tell session, so I can't really say any of it was complex on a technical level; you just needed to know how to read shitty handwriting, or fake it well enough that nothing obvious seemed wrong.

But I'm 100% sure that mistakes down there led to people having all sorts of complications.
And I would bet that boredom and carelessness contributed to a lot of mistakes.
 
For anyone thinking it's crazy to consider that Nvidia might just exit gaming... yeah, it's not crazy.

We as enthusiasts don't want to hear it... but that uptick in gaming sales the last couple of quarters is just Chinese companies doing an end run around import bans, snapping up and importing 4090s. They then promptly pull the silicon and GDDR out, toss the rest of the cards onto the secondary market for parts... and reseat the GPUs onto blower cards for racks.

Unless something seriously changes with the AI market in the next year or so, I wouldn't be shocked to see Nvidia just skip an entire generation of gaming cards. They did it with the Tesla parts a few years back, as the competition didn't push the 1080s much anyway... so they just dedicated all that generation's silicon to data center. Considering how much stronger the data center margins are, don't be shocked if we see a 4090/80/70 Super launch and then nothing new for gaming for a few years, assuming Nvidia doesn't just exit gaming at that point if AI hasn't bubbled itself. I mean, see it from their perspective: gaming will be 1/8th the size of data center at 1/3 the profit margin, and they already can't get enough silicon. The only gaming cards that would make sense for them would have to be on an inferior process (or be C-string cast-offs). If Intel and AMD both have decent gaming GPUs on the market, Nvidia might not really want to compete and siphon off good silicon... nor would they want to release inferior silicon and be the #3. It might be best for them, marketing-wise, to just exit gaming while they are seen as king.
 
We as enthusiasts don't want to hear it... but that uptick in gaming sales the last couple of quarters is just Chinese companies doing an end run around import bans, snapping up and importing 4090s. They then promptly pull the silicon and GDDR out, toss the rest of the cards onto the secondary market for parts... and reseat the GPUs onto blower cards for racks.

PC Gaming is alive and well, and is growing, believe it or not, as "PC Master Race" type social media groups convince more and more kids that the best gaming experience is on the PC and to abandon their consoles. PC Gaming has been in a renaissance for several years now.

Gaming has been growing fantastically fast since probably ~2015 or so, when all the color

That said, most buyers right now are hunting used parts, because new ones are priced out of reach for most people. PC Gaming was never a cheap hobby, but since the start of the pandemic in early 2020, the hobby has just become more and more expensive, and that has certainly slowed sales.

I don't for a second believe that it is because PC gaming is stagnating or becoming less popular. It's just that people either can't afford, or don't want to pay for the poor price/performance ratio of the latest gen parts.
 
I don't for a second believe that it is because PC gaming is stagnating or becoming less popular. It's just that people either can't afford, or don't want to pay for the poor price/performance ratio of the latest gen parts.
Let's face it - an almost 7-year-old 1080ti is still enough GPU to run upcoming games on high quality. About the only thing you're missing out on is ray tracing, and is ray tracing really worth the gigantic cost of a current-gen card?
 
Let's face it - an almost 7-year-old 1080ti is still enough GPU to run upcoming games on high quality. About the only thing you're missing out on is ray tracing, and is ray tracing really worth the gigantic cost of a current-gen card?

Things have changed in the last year though. Starfield - for instance - does not have ray tracing, and 1080p performance hovers in the low-to-mid 20fps range on a 1080ti.

And yes, Starfield running on Bethesda's janky Creation engine is highly inefficient for the visuals it delivers, but still.

Cyberpunk 2077: Phantom Liberty manages a solid 60-70 fps average at 1080p on a 1080ti without ray tracing, so it is totally playable.

As is Baldur's Gate 3, at ~85 fps average with lows in the mid-50s at 1080p on a 1080ti.

But kids these days are also not happy unless they are getting like 120fps-165fps, so I don't know. My take on what is considered "playable" may no longer be relevant here.

I'm not sure exactly what is going to happen in gaming, but games are starting to break through the 1080ti barrier, and eventually the kids are going to have to start buying new GPUs, get increasingly unsatisfying performance, or give up and go back to playing on consoles. Which of those actually happens is anyone's guess; mine is a little bit of each.
 
PC Gaming is alive and well, and is growing, believe it or not, as "PC Master Race" type social media groups convince more and more kids that the best gaming experience is on the PC and to abandon their consoles. PC Gaming has been in a renaissance for several years now.

Gaming has been growing fantastically fast since probably ~2015 or so, when all the color

That said, most buyers right now are hunting used parts, because new ones are priced out of reach for most people. PC Gaming was never a cheap hobby, but since the start of the pandemic in early 2020, the hobby has just become more and more expensive, and that has certainly slowed sales.

I don't for a second believe that it is because PC gaming is stagnating or becoming less popular. It's just that people either can't afford, or don't want to pay for the poor price/performance ratio of the latest gen parts.

I'm not saying PC gaming is dead. I'm saying for Nvidia the more high end GPUs they produce the more profit they leave on the table. Those same chips can be making them 10x as much right now. Even if by some miracle they tripled their Fab output that would still be true. Gaming isn't dying no... but Nvidia seems to be in a unique position where their chips have not just a little but like 10x the value in another market.

If Nvidia doesn't leave gaming, their cards are going to get even more expensive. Worse: if people think it's bad that Nvidia sells a 4090 that would have been 80-level silicon last gen, just wait till they start calling "5070"-level parts 5090s next year.
 
Even if by some miracle they tripled their Fab output that would still be true.
Not so sure about triple, but that level of demand for AI compute will not last for a long time (chances are, at least), and AI compute will probably be on the latest node, unlike gaming stuff.

To put it in a very extreme case, chances are non-zero that Nvidia will spend many tens, potentially hundreds, of millions of dies on the next Nintendo Switch, the lowest end of gaming we can think of. Those will not be on TSMC N3 or on the best Samsung node, and will not directly compete with those products' volume, in that way at least.

In 2024, the expensive releases on a recent node for the high-end market will be pure highest-margin enterprise products; by 2025, gaming cards will be released on a node that is no longer the latest and greatest (almost 2 full years behind by then), not in direct competition with the B100 during that product's high-demand window.
 
Not so sure about triple, but that level of demand for AI compute will not last for a long time (chances are, at least), and AI compute will probably be on the latest node, unlike gaming stuff.

To put it in a very extreme case, chances are non-zero that Nvidia will spend many tens, potentially hundreds, of millions of dies on the next Nintendo Switch, the lowest end of gaming we can think of. Those will not be on TSMC N3 or the best Samsung node, and will not directly compete in that way at least.

In 2024, the expensive releases on a recent node will be for the high-end market; by 2025, gaming cards will be released on a node that is no longer the latest and greatest, not in direct competition with the B100 during its high-demand window.
The next Switch is likely using a modified Jetson Orin Nano, which can be manufactured on the current Samsung 8N node, which got revamped in 2021, addressing the issues Nvidia had with it originally.
The Jetson Orin Nano can also be done on TSMC 7, as the Ampere parts themselves were split between Samsung 8N and TSMC 7 depending on application: consumer got Samsung, while enterprise got TSMC.
And since TSMC 6 is just an advancement on 7, anything that could be fabbed on TSMC N7 can be done on N6. But TSMC is expensive, so they can keep it on Samsung for now and shrink it to N6 down the line when that capacity frees up in a few years, much like the original Switch did by going from 20nm down to 16nm.

Nvidia is putting out enterprise Blackwell a year early and launching it on TSMC N3, while the consumer parts will still be a year out and on N4, because N4 is much cheaper than N3 and has much higher capacity available.

But I think and hope this is the start of Nvidia pulling a page from AMD's playbook and separating the consumer and enterprise silicon. AMD was very smart to do their RDNA and CDNA architectures separately, and between demand and export bans, I hope Nvidia does the same going forward.
Should Nvidia separate them, it makes the consumer stack easier, and they can stop with much of the fuckery they do to keep the consumer parts from being viable enterprise components, because if the architectures are fundamentally different it's not like you could just firmware-flash a 5080 into an RTX BJ 5000.
 
Not so sure about triple, but that level of demand for AI compute will not last for a long time (chances are, at least), and AI compute will probably be on the latest node, unlike gaming stuff.

To put it in a very extreme case, chances are non-zero that Nvidia will spend many tens, potentially hundreds, of millions of dies on the next Nintendo Switch, the lowest end of gaming we can think of. Those will not be on TSMC N3 or on the best Samsung node, and will not directly compete with those products' volume, in that way at least.

In 2024, the expensive releases on a recent node for the high-end market will be pure highest-margin enterprise products; by 2025, gaming cards will be released on a node that is no longer the latest and greatest (almost 2 full years behind by then), not in direct competition with the B100 during that product's high-demand window.
I agree with you... the X factor will be Intel and AMD really.

I agree that Nvidia is capable of designing very good gaming cards on an older node (assuming they are actually willing to put engineering resources on that task). I think the competition is the X factor here. If, let's say, 3 years from now Intel has a new GPU on the market that is 10% faster than a 4090 Super and AMD is 10-20% faster than that, is Nvidia willing to dip into their good silicon pool to best them? The Nintendo deal is much like AMD's console sales... those are essentially a separate thing that has no real bearing on PC-market GPUs. If Nvidia's lower-tier fab silicon can compete, sure, Nvidia is probably still a gaming company. However, what if it isn't enough to compete, really? Is Jensen OK with staying in a gaming market... where Nvidia isn't #1?

That is the way I see it anyway. If Nvidia can produce parts on lower-tier fab processes, turn out a 10-15% performance bump, and RETAIN the #1 position, sure, I bet they stick around. However, if the competition is willing to use some of that top-tier silicon and can best Nvidia's lower-tier silicon parts, I suspect Jensen pulls the plug. I can't imagine him staying around in a discrete GPU market where Nvidia isn't king... and I really don't see him pulling even a token amount of top-tier silicon to launch an always-out-of-stock halo card. I could be wrong, of course... perhaps Jensen's ego isn't as bloated as I assume, and he couldn't care less if Intel or AMD claim a performance crown and will be happy to compete and sell 60-grade parts.
 
Accountability usually seems to be an afterthought when these types of things are implemented. Who's liable when this inevitably kills a noncombatant?

There's just too much faith in this technology (AI of any form) for me to feel remotely comfortable about relying on it.
Without going full soapbox here, there already isn't any form of accountability. Technically the guy giving the orders is 'supposed to be' accountable, and we had more than a few discussions about this idea at the Nuremberg Trials (that you should disobey unlawful orders, or do the moral thing in the face of immorality, but frankly that isn't how any armed service operates). The reality is that non-combatants are killed every day and no one is held accountable outside of vigilantism.
 
Let's face it - an almost 7-year-old 1080ti is still enough GPU to run upcoming games on high quality. About the only thing you're missing out on is ray tracing, and is ray tracing really worth the gigantic cost of a current-gen card?
Everyone has their own standards. No need to force anything on anyone.
 
Without going full soapbox here, there already isn't any form of accountability. Technically the guy giving the orders is 'supposed to be' accountable, and we had more than a few discussions about this idea at the Nuremberg Trials (that you should disobey unlawful orders, or do the moral thing in the face of immorality, but frankly that isn't how any armed service operates). The reality is that non-combatants are killed every day and no one is held accountable outside of vigilantism.

The death of a non-combatant is a tragedy, but it isn't necessarily a crime.

If that death is intentional, or resultant from indiscriminate bombardment/shelling/shooting, then it very well may be a war crime.

There are a lot of grey areas, and legally it is pretty challenging to make a bullet-proof case, but the militaries of the West do their best to uphold these laws. It's not perfect, just like how civilian prosecution is not perfect, but you can tell the "good guys" from the "bad guys" by who is holding their own to account.

There are shitty people at all levels of society, and the military is not immune, but at least we try to prosecute those who violate the laws of decency.

For examples, you need look no further than those court-martialed for the Abu Ghraib prison abuses, or the court-martial of Robert Bales, who mass-murdered Afghan civilians. There were also the courts-martial of those involved in the Mahmudiyah rape and murders.

While these are the only cases that come to mind right now, there are certainly others (I'm just not an expert). So yes, there is a system to hold people to account. It's not perfect, but it does exist, and it does hold those responsible for verifiable atrocities accountable. Usually, however, the problem is the "verifiable" bit.
 
But kids these days are also not happy unless they are getting like 120fps-165fps, so I don't know. My take on what is considered "playable" may no longer be relevant here.

I'm not sure exactly what is going to happen in gaming, but games are starting to break through the 1080ti barrier, and eventually the kids are going to have to start buying new GPUs, get increasingly unsatisfying performance, or give up and go back to playing on consoles. Which of those actually happens is anyone's guess; mine is a little bit of each.
As I recall, the games the neighbor "kids" (17, 20, and 22) play are League of Legends, Counter-Strike, and similar esports titles. I think a 1080ti should be able to max out a 1080p 144Hz display in those games, and honestly, I don't think any of the three have a video card as good as the 1080ti either (I helped build their gaming machines). Unless they upgraded and I don't know about it, the oldest built his at Christmas 2018 and got a 2070, the middle has a 2070 Super, and the youngest a 3060ti.

In any case, though, you're right. 1080ti performance level is eventually not going to be enough to have satisfactory performance in most games, even if that takes the release of the next gen consoles. At that point, though, they can probably buy a 6060ti for $500 and call it good enough.
 
That starts to be above the 1080ti (+17%); the 2070 Super would be a tie, going by TechPowerUp's ranking.
There aren't a ton of reviews that contain both cards. The best article I know of shows the 3060 vs the 1080ti (https://www.techspot.com/review/2525-geforce-gtx-1080-ti-revisit/) with the 1080ti winning by about 3% on average. 1080ti wins in older games, loses in newer games, and the most popular esports games don't seem to be tested. Given this, the 3060ti will almost certainly win most comparisons, but not by enough of a margin that you'd replace the 1080ti with a 3060ti.
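
(For anyone wondering where figures like "+17%" or "about 3% on average" come from: reviewers typically average the per-game performance ratios rather than the raw FPS. A toy example of the arithmetic, with made-up numbers chosen only to mirror the "wins older titles, loses newer ones" split described above:)

```python
# How an "X% faster on average" figure is typically computed: geometric mean
# of per-game performance ratios. FPS numbers below are made up, purely to
# show the arithmetic, not real benchmark data.
from math import prod

games = {  # (1080 Ti fps, 3060 fps) -- illustrative only
    "older title A": (144, 128),
    "older title B": (118, 108),
    "newer title C": (61, 64),
    "newer title D": (48, 50),
}

ratios = [a / b for a, b in games.values()]
geomean = prod(ratios) ** (1 / len(ratios))
print(f"1080 Ti vs 3060 on average: {geomean - 1:+.1%}")  # ~+3% with these toy numbers
```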

If we look at the requirements for the upcoming Homeworld 3 (https://forums.thefpsreview.com/thr...3-frame-generation-hdr-and-ray-tracing.14505/), you'll see that the 1080ti should still be able to run "high" settings.
 
Let's face it - an almost 7-year-old 1080ti is still enough GPU to run upcoming games on high quality. About the only thing you're missing out on is ray tracing, and is ray tracing really worth the gigantic cost of a current-gen card?
This right here is the reason why Nvidia doesn't want to make GPUs. The cost of making a GPU faster now has diminishing returns. To make a GPU faster you need to consume more power, generate more heat, and make the die sizes larger. This means the cost of GPUs needs to keep up, but fewer people are buying GPUs now than before. This is why AMD and Nvidia haven't made a good replacement for the GTX 1060 and the RX 580: doing so would mean selling the equivalent of a GTX 1080 Ti. Nvidia doesn't want to deal with this problem, so of course Nvidia has been expanding into other markets that aren't gaming.
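
To put some rough numbers on the "bigger die" point: dies per wafer falls linearly with area, but yield falls roughly exponentially, so cost per good die climbs much faster than size. A sketch using a textbook Poisson yield model with made-up wafer prices and defect densities (not real foundry figures):

```python
# Why "just make the die bigger" gets expensive fast: a textbook Poisson yield
# model. Wafer price and defect density below are assumptions, not TSMC data.
import math

WAFER_COST = 15000.0                     # assumed wafer price, USD
WAFER_AREA = math.pi * (300 / 2) ** 2    # 300 mm wafer area in mm^2 (edge loss ignored)
DEFECT_DENSITY = 0.001                   # assumed defects per mm^2 (0.1 per cm^2)

def cost_per_good_die(die_area_mm2):
    dies_per_wafer = WAFER_AREA / die_area_mm2
    yield_rate = math.exp(-DEFECT_DENSITY * die_area_mm2)  # Poisson yield
    return WAFER_COST / (dies_per_wafer * yield_rate)

for area in (200, 400, 600):  # roughly mid-range, high-end, flagship die sizes
    print(f"{area} mm^2 die -> ~${cost_per_good_die(area):,.0f} per good die")
```

Under these assumptions, tripling the die area ends up costing roughly 4-5x per good die, which is the "diminishing returns" in practice.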
I'm not saying PC gaming is dead. I'm saying for Nvidia the more high end GPUs they produce the more profit they leave on the table. Those same chips can be making them 10x as much right now. Even if by some miracle they tripled their Fab output that would still be true. Gaming isn't dying no... but Nvidia seems to be in a unique position where their chips have not just a little but like 10x the value in another market.
We've seen this before with Crypto, and it'll end like Crypto: with Nvidia having too much stock.
If Nvidia doesn't leave gaming, their cards are going to get even more expensive. Worse: if people think it's bad that Nvidia sells a 4090 that would have been 80-level silicon last gen, just wait till they start calling "5070"-level parts 5090s next year.
Would be foolish for them to do so. At some point either the AI fad ends, or others join in on AI hardware sales. AI is so up in the air that nobody knows where it'll go.
 
We've seen this before with Crypto, and it'll end like Crypto: with Nvidia having too much stock.
Nah, enterprise GPUs are made to order, and Nvidia does trade-ins and buy-backs; those are sold before they leave assembly and rarely end up on the second-hand market.
 
We've seen this before with Crypto, and it'll end like Crypto: with Nvidia having too much stock.

Would be foolish for them to do so. At some point either the AI fad ends, or others join in on AI hardware sales. AI is so up in the air that nobody knows where it'll go.
AI isn't crypto, sadly. Too many companies are rolling out too many useful products. Crypto doesn't produce anything of value; AI is producing value. I know a few people working in AI... they aren't burning VC money or sucking up crypto investor dollars. They are producing products that are saving companies insane amounts of money. At two separate companies I know of where friends are working, they started small and got one client a few years ago; now they are using the same models to score customers in completely different markets. I have a cousin at one company... he has been there 5 years now. For the first 2 years there were no customers, just him and a few other guys sweating in an office. 5 years later they can't hire sales and support fast enough to keep up with demand. The AI explosion is just starting.

As for other companies producing hardware: yes, sure they will. The problem is Nvidia is not just an AI hardware company; they are also an AI software company. To say they have a first-mover advantage right now is an understatement.

Nvidia won't exit gaming tomorrow... but don't be shocked when it happens sometime in the next few years. I can't imagine a world where Nvidia only sells mid-range graphics accelerators. It doesn't matter that they still sell $7 or 8 billion in gaming cards... the truth is that right now up to 25% of those cards are being repurposed by China (and others) for AI use. Nvidia's 4070-and-up gaming silicon could easily be repurposed at higher profit margins today. I think the chances of a Blackwell generation of gaming cards are right about zero at the moment. Whatever comes after Blackwell is going to be so heavily skewed to AI work that it might be no better at raster than what they have on the market today. (I suspect Blackwell won't be any better at raster work either... it's heavily skewed to tensor and AI workloads.)
 
Whatever comes after Blackwell is going to be so heavily skewed to AI work that it might be no better at raster than what they have on the market today. (I suspect Blackwell won't be any better at raster work either... it's heavily skewed to tensor and AI workloads.)
:ROFLMAO: Come on, really now?
 
:ROFLMAO: Come on, really now?
You forget they skipped Volta consumer cards, I guess. Volta was no faster than the previous gen in terms of raster. The Blackwell design is almost entirely geared to AI workloads.

Here, let me quote one of the "leaks", if we trust them, for upcoming Blackwell...
" The big news is that, like Ada Lovelace, Blackwell will be a quantum leap forward for Nvidia as it moves from TSMC's 4/5nm process to the much-hyped 3nm node. This could allow it to deliver a 2-2.6X performance improvement over its current GPUs, even with a monolithic design. The flagship GB102 is expected to offer just 144 SMs, the same as AD102. Therefore, the advancements will come not from throwing more hardware at the problem but from a new, more efficient design."

Let me translate that for you into reality: a 2-2.6x improvement in AI workloads. It's going to have the same amount of raster hardware as this gen. This is EXACTLY what they did with Volta. Volta had the same amount of transistors for raster as the previous gen did... it used the die shrink to add tensor cores. Blackwell is using the die shrink to add more tensor cores. Raster improvements are not coming to Nvidia hardware anytime soon (outside some Super redesigns with a token efficiency bump).
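
A quick sanity check on that translation, using the leaked SM count and an assumed (not leaked) clock bump from the node change:

```python
# If the SM count stays flat, raster throughput scales mostly with clocks, so
# a node-driven clock bump of ~15% (an assumption, not part of the leak) gets
# nowhere near the quoted 2-2.6x. The big multiplier has to come from elsewhere.
AD102_SMS = 144
GB102_SMS = 144        # per the quoted leak
CLOCK_GAIN = 1.15      # assumed uplift from the new node

raster_scaling = (GB102_SMS / AD102_SMS) * CLOCK_GAIN
print(f"Expected raster scaling: ~{raster_scaling:.2f}x")
print("Claimed overall uplift:   2.0x - 2.6x")
```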
 
You forget they skipped Volta consumer cards, I guess. Volta was no faster than the previous gen in terms of raster. The Blackwell design is almost entirely geared to AI workloads.

Here, let me quote one of the "leaks", if we trust them, for upcoming Blackwell...
" The big news is that, like Ada Lovelace, Blackwell will be a quantum leap forward for Nvidia as it moves from TSMC's 4/5nm process to the much-hyped 3nm node. This could allow it to deliver a 2-2.6X performance improvement over its current GPUs, even with a monolithic design. The flagship GB102 is expected to offer just 144 SMs, the same as AD102. Therefore, the advancements will come not from throwing more hardware at the problem but from a new, more efficient design."

Let me translate that for you into reality: a 2-2.6x improvement in AI workloads. It's going to have the same amount of raster hardware as this gen. This is EXACTLY what they did with Volta. Volta had the same amount of transistors for raster as the previous gen did... it used the die shrink to add tensor cores. Blackwell is using the die shrink to add more tensor cores. Raster improvements are not coming to Nvidia hardware anytime soon (outside some Super redesigns with a token efficiency bump).
I sort of have to agree with this one. I expect the new architecture and new node will net the usual ~15% clock-for-clock performance improvement for raster, but it will have much greater performance with RT and such. That's fine though: if you look at the features that are eating FPS, it's all the stuff that has been offloaded to the tensor cores; that's where the current bottlenecks are. Turn off all those features and most titles perform just fine across the whole lineup at their intended resolutions.
 
I sort of have to agree with this one. I expect the new architecture and new node will net the usual ~15% clock-for-clock performance improvement for raster, but it will have much greater performance with RT and such. That's fine though: if you look at the features that are eating FPS, it's all the stuff that has been offloaded to the tensor cores; that's where the current bottlenecks are. Turn off all those features and most titles perform just fine across the whole lineup at their intended resolutions.
That is true, assuming Jensen is willing to sell that silicon into the low-profit gaming sector.
The thing is, if Blackwell really is going to provide 2-2.6x the AI performance... there is no way in hell Jensen is dedicating even 3/4-functional parts to gaming. If those gains are even half true, a 3/4-functional Blackwell die is going to outperform their current $20k AI accelerators.
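Quick back-of-the-envelope on that, taking the quoted 2-2.6x per-die figure at face value and assuming performance scales linearly with enabled units (a simplification):

```python
# A 3/4-enabled Blackwell die vs. a fully enabled current-gen die, assuming
# the leaked 2-2.6x uplift and linear scaling with enabled units (both assumptions).
FULL_DIE_UPLIFT = (2.0, 2.6)
ENABLED_FRACTION = 0.75

low, high = (u * ENABLED_FRACTION for u in FULL_DIE_UPLIFT)
print(f"3/4-enabled next-gen vs. full current-gen: ~{low:.2f}x to ~{high:.2f}x")
```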
I have no doubt at this point that Nvidia will ship 4090/80 Super cards and that will be the only high-end stuff they ship until a Blackwell refresh. I mean, who could even blame them at this point... it's not like AMD or Intel is rumored to have some uber card that is going to release in that time and stomp the 4090.
 
The thing is, if Blackwell really is going to provide 2-2.6x the AI performance... there is no way in hell Jensen is dedicating even 3/4-functional parts to gaming.
It seems almost official that zero of the latest-and-greatest functional parts will go to gaming for all of 2024, so you are more than right about that.

After that, you probably again have low-volume, high-end-only parts launching first in early 2025, maybe again failed RTX 6000-successor dies and the xx80 on a smaller die. By the time the 60-70 series launch, there is a chance that Hopper's successor will be a bit mature and past peak demand, the A100 old, the H100 finally purchasable, and AI supply/demand back in balance, let alone how much good-node fab capacity around the world could have come online by 2025.

With how easy it will be to beat Lovelace, and how tired of Ampere in the field they could get (having the 3060 and better be the norm in PCs before the PS6 releases could matter), it could go either way.
 