Just kidding.... But it's fun to kick sand at Intel and Ngreedia; both companies have proven to me time and time again that they don't give a flying fuck about the end user. Intel cares about Dell, Lenovo, etc. (bulk sales). And if you pre-ordered anything from Intel or NVIDIA, you get what you deserve. Intel should rename it the 9900K FX.
I wouldn't be too happy sanding 2mm from the die; I would be worried about going right through to the silicon.
*80%, and significantly shorter maximum frametimes on top of that.
Yup, shorter frametimes at 1080P, with what is most likely a 1080Ti.
Switch that to 1440P or 4K and that frametime difference gets thrown out the window.
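For anyone unclear on why the gap shrinks at higher resolutions: frametime is just the reciprocal of framerate, and once the GPU becomes the bottleneck at 1440p/4K, both CPUs end up pinned at roughly the same framerate. A quick sketch of the conversion (the fps figures below are made up for illustration, not benchmark results):

```python
# Convert frames-per-second figures into mean frametimes in milliseconds.
# The fps numbers below are hypothetical, chosen only to illustrate the point.

def frametime_ms(fps: float) -> float:
    """Mean frametime in milliseconds for a given average framerate."""
    return 1000.0 / fps

# Hypothetical CPU-bound 1080p averages for two different CPUs:
fast_cpu_fps = 160.0
slow_cpu_fps = 140.0
print(frametime_ms(fast_cpu_fps))  # 6.25 ms
print(frametime_ms(slow_cpu_fps))  # ~7.14 ms

# At 4K the GPU caps both systems at roughly the same framerate,
# so the frametime difference between the CPUs mostly disappears:
gpu_bound_fps = 60.0
print(frametime_ms(gpu_bound_fps))  # ~16.67 ms for either CPU
```

This is also why 1080p testing is used to isolate the CPU: it keeps the GPU out of the way so the frametime differences actually reflect the processor.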
Not really?
And when you upgrade GPUs and new games come out, there it is again!
My God, people have zero understanding of this stuff. Go read what Kyle has written about testing at 1080p. And if you're still confused, read it again.
From a real game experience, it hardly matters.
I'm going to agree with the majority of your post, I just want to clarify something a bit: I'm not looking at the 2080Ti, but what comes after that. People keep CPUs for longer than ever, and that's part of my point. You're getting two extra cores with class-leading per-core performance, and yeah, it costs a few bucks extra, but you also don't need to upgrade it as soon.
My own experience has taught me not to future-proof beyond what I see as the life of the system (CPU+GPU, about 4 years).
I could do this. I can dial in .001" increments. Hmm......
Get access to a surface grinder
Build a CPU mounting fixture
Sacrifice a CPU to find correct depth of cut
Offer service to Silicon Lottery
Profit
Yes.
Can these be delidded?
Yes.
Intel isn't going to waste silicon or engineering hours on something you don't need. The 9900K gets hot as hell under the lid, especially overclocked, as the reviews show. Perhaps the material is needed to limit the damage to the CPU over time.
I don't inherently disagree- this is why I made the mistake of getting a 2500k instead of a 2600k! Damn thing ran a GTX570 at the start, and ended with a pair of GTX970s...
I see they can, but does the sauder stick to the IHS and the Intel die?
I already failed one bet, that was it would take Intel ~1 year to respond to Ryzen, they managed it in <6 months.
What is sauder?
You know what he meant.
So Intel's new chip runs hot, and even though they went back to solder, it still needs a delid and liquid metal, and now you have to sand the die and IHS?
So basically unless you're going to move to LM, there's no point delidding.
it still needs
At the edge of its thermal capacity?
You'd be very likely to get 5.0GHz on eight hyper-threaded cores, perhaps with a small AVX offset. I can't support the claim that it needs anything, given that it's at the 'edge' already.
At the edge of its thermal capacity?
To the best of my knowledge, the topmost silicon layer of these flip-chip packages is usually known as the de-stress layer. It is there to mechanically cope with differential thermal expansion under load. If this layer isn't specified correctly in thickness, the silicon die can eventually crack, detach, or delaminate internally. I remember this was an issue with some earlier generations of GPUs, like ATI's 9000 series, where the GPUs would eventually delaminate internally if users added bigger-than-original coolers with higher clamping forces.
I wouldn't be too hasty to sand off the top layers, unless it is for experimental purposes like Der8auer's. I suspect Intel put that thicker PCB and de-stress layer on top because the elongated 8-core silicon layout is particularly prone to thermal stresses.
At the edge of its thermal capacity?
At the edge of physics, really.
That seems like a viable explanation, too. I don't think this iteration had a die shrink, though.
Was the silicon die thicker prior to the move to TIM? Not sure if that can be measured - TIM came a long time ago now and CPU silicon has changed - but it would be interesting.
With the thicker silicon die, I wonder if direct-die cooling will rear its head again. If the delidding and sanding are as straightforward as they look in the video, I could see practicing on an i3 first before breaking apart the i9.
This architecture is at its end. Intel has pulled out all the stops to give it one last hurrah (the hurrah that should have been the 8086K, in my view), but really it's done. You can see how poorly it scales now and how much power it needs to get there. It has vulnerabilities they're never going to fix in hardware. Ringbus is awesome, but it's showing cracks too. They're looking forward to XYZ-Lake in H2 2019 and beyond with major changes. This thing simply can't be refreshed any more. Adding insulation to 'limit the damage' from heat? What a load of bullshit.
This is Intel, they made it thicker so they can artificially squeeze out some additional performance from the next refresh.
Indeed, it is the CPU die itself that he sanded. He insists that the circuits are on the bottom of the silicon, so taking some off the top won't kill it.
What is sauder?