At least the US Government isn't just outright assassinating political dissidents. The '60s were rough; it wasn't your cellphone getting you killed, it was being a capable enough person to organise people without the internet.
Snowden's basic point was the government *is* psychotic enough to burn...
I can seat the connectors fine, but if you put too much strain on them they just start to pop out. Why would people put strain on them? Cuz the cards are way out of spec physically, and then the dumb-ass connectors stick straight out even further. It's just one connector, it could be pointing in so...
Crysis was fucked because it was designed for the 5-6GHz single-threaded CPUs that never eventuated, and it was hella CPU-bound for a long time.
If I want to break my system I use Fortnite, it gets a new unobtanium setting every three months with Epic showing off the latest engine improvements.
16GB what? Fuck off, no way. You can put like so many of them in 4RU, holy shit.
They'll be like hen's teeth to buy.
And when you're handing out PCIe 5.0 lanes, who cares that it's x8? That means it only takes four of my lanes.
Think new new version of this...
I think you all missed the part where DLSS lets nVidia turn AI into raster. Raster is stuck where it is, but DLSS will get really, really, really good. Or at least it will become the only game in town.
GPT-3 was trained on fuckin Voltas. The new new LLMs are yet to be built, and the current ones are delivering massive, undeniable accelerations to workloads that render you uncompetitive without them in some industries.
But the thing is while training is a bitch, if you build something with...
Apple didn't stop making laptops and PCs. Witness the M3 Ultra!
But they're a phone company now. With a lil bitty PC department.
nVidia will keep making the most kick-ass gfx cards, but it will be a tiny department reusing stuff developed for datacenters.
The goal is models that give a majority of people a 30% boost in productivity without dimming the lights of the whole planet.
However, how we finally get there might leave nVidia looking like SGI, which would be ironic. Or Google looking like Yahoo.
And the VC's sure as shit can't tell one AI...
AMD could massively slash their GPU prices and go for market share... but where would they fab them?
The save-the-company, win-the-future moves are where you get 100x+ markups, and those customers are buying every kind of accelerator as fast as they can be shipped.
I feel like the next...
AMD has figured out tiling GPUs with the MI300. They can make any capacity APU they want now.
They can also focus on upgrading just the GPU tiles while the GPU I/O chip stays the same and the card stays validated, so they can do much faster cycles.
But they got to get their architecture...
I dunno. GPT-4 isn't nearly as dumb as some of my friends, and isn't going to choose Windows for a desktop, for example. Or get knocked up by a dipshit. Or buy a house with a mortgage right before interest rates are obviously set to climb for decades.
The number of humans who actually change their minds based on explicit data is pretty small. Most go by their gut, which is in fact making statistical guesses based on its training data.
But programming based on a gut instinct about which function you should build next is a risky process, unless...
I used to haul around a 7 kg laptop with a 2 kg power supply. It had the first dual-core AMD chip, might have been a desktop part. RAIDed hard drives. 17" screen. Maybe it had SLI GPUs? I had the first-model Dell 24" wide HD LCD with a handle I bolted to the top. I traveled 'door to door'...