Intel Plans To Have Spectre & Meltdown-Proof CPUs This Year

They've got to have broken some laws here. They knew about their CPU flaws ahead of their latest release and went to market regardless. They basically withheld this information and sold flawed CPUs. Now they will release "new" CPUs without the flaw instead of replacing all the flawed CPUs they knowingly sold. WTF?
 
All Intel and AMD CPUs sold have bugs/flaws. If we had to wait until a CPU was bug-free, updates would be many years apart.
 
[...] That's hyperbole. Gaming performance wasn't impacted. Most software wasn't impacted. The programs that were impacted are the ones that request large amounts of data from hard drives/SSDs. The higher-performance the storage solution, the greater the impact. The typical home user won't see any impact, but data centers will see huge ones. Most people running SATA3 SSDs or slower will see little performance impact.
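If you want to sanity-check that on your own machine, a rough micro-benchmark along these lines (file name and sizes are arbitrary, purely illustrative) hammers the small-read, syscall-heavy pattern that the kernel page-table isolation patch makes more expensive. Time it before and after applying the patches and compare:

[CODE]
import os
import time

# Rough illustration only: lots of tiny unbuffered reads means one syscall
# per read, which is the access pattern the Meltdown (KPTI) fix slows down.
PATH = "scratch.bin"          # throwaway test file
SIZE = 16 * 1024 * 1024       # 16 MiB of data
CHUNK = 512                   # small reads = many syscalls

with open(PATH, "wb") as f:
    f.write(os.urandom(SIZE))

start = time.perf_counter()
with open(PATH, "rb", buffering=0) as f:   # buffering=0 so every read() hits the kernel
    while f.read(CHUNK):
        pass
elapsed = time.perf_counter() - start
print(f"{SIZE // CHUNK} reads of {CHUNK} bytes: {elapsed:.2f} s")

os.remove(PATH)
[/CODE]

The file will mostly be served from the OS cache, which is fine here: the overhead being measured is per syscall, not per disk access.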

It's anecdotal, but I have noticed a number of things on a laptop I use mostly for work, but sometimes for typical home tasks. It's an HP EliteBook 840 G2 with a 5600U CPU and SSD storage. It's running Windows 7.

1. My work, which is far from super data intensive and has nothing to do with server tasks, suffers. I build projects in Visual Studio very frequently, and they're large-ish. I lost about 20-35% of performance, depending on the project and the build options. That is no small impact. And sure, the build process reads/writes lots of small files, but it's not quite the "only data centers need to worry" line Intel likes to push.

2. Screen draws and updates are sometimes noticeably slower. We're not talking a factor of 10, but when I test GUIs with large numbers of buttons, boxes, lists, etc., I can see a difference in performance before and after. The laptop uses the GPU built into the 5600U, not a dedicated third-party chip.

I have gone back and forth between the old and new BIOS (the microcode Spectre fix is what really makes the difference, not the Windows patches) and I have tested this multiple times. The performance impact is VERY obvious to me.
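For what it's worth, my "testing multiple times" is nothing fancier than timing the same clean rebuild a few times under each BIOS revision and comparing averages. A throwaway harness like this does the job (the MSBuild path and solution name are placeholders for my setup; substitute your own):

[CODE]
import subprocess
import time

# Placeholder paths -- point these at your own MSBuild and solution.
MSBUILD = r"C:\Program Files (x86)\MSBuild\14.0\Bin\MSBuild.exe"
SOLUTION = r"C:\work\MySolution.sln"
RUNS = 3

times = []
for i in range(RUNS):
    start = time.perf_counter()
    subprocess.run([MSBUILD, SOLUTION, "/t:Rebuild", "/m"],
                   check=True, stdout=subprocess.DEVNULL)
    times.append(time.perf_counter() - start)
    print(f"run {i + 1}: {times[-1]:.1f} s")

print(f"average over {RUNS} runs: {sum(times) / RUNS:.1f} s")
[/CODE]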

Regarding gaming, I have no experience on this machine, but I will say that while some benchmarks go one way or the other, we are far from having benchmarked everything. I am sure there will be outliers, and when you happen to like one of those games, well, then this really sucks. That Intel and a lot of 'tech sites' dismiss it means little to that user. I really dislike the "most users..." attitude. Perhaps you're right that a lot of people won't feel this, perhaps most. To me as an enthusiast and heavy user, that doesn't matter: I lose a large amount of performance, and that's all I care about.

I'm not one screaming "refunds! replacements!" from the rooftops, but you know what, since it's mostly enthusiasts feeling the pain, why can't Intel throw us a bone and say, "OK, we messed up on some of the engineering. Here, 20% off your next CPU purchase if you have proof of ownership of a Skylake or newer CPU. You know, around the time we really should have fixed this already."
 
Considering AMD finally has a strong performer with Ryzen, and Intel hitting the shady button on its entire customer base, this is easily the most compelling time in over a decade to consider AMD for practically any type of computer purchase.
 
One would think this would be a priority for Intel, as in a don't-sell-anything-until-you've-fixed-it kind of priority. However, it's still business as usual on the Intel front until something convenient comes along.
 
At least I'm not the only one running that generation. Still limping my i5-2500k along.

Sandy Bridge was one of Intel's best generations. Nothing wrong or "limping" about it. In fact, Intel made some serious mistakes (that were good for the consumer) with that line.
 
Sandy Bridge was one of Intel's best generations. Nothing wrong or "limping" about it. In fact, Intel made some serious mistakes (that were good for the consumer) with that line.
I'm gonna keep using my 3970X for a while still. Next purchase will probably be Zen 2 or similar on the AMD side.
 
I just want the best value for whatever amount of money I'm going to spend. I don't really care if there is only a 5% difference. That still makes one better than the other depending on price.

Everyone is assuming that AMD wouldn't be handling this problem exactly the same way Intel is if their places were swapped. I think that is a pretty big leap of faith.

I assume nothing.
It's about appearances and how each company carries itself and responds to situations.
I have faith that AMD would have handled the situation a LOT BETTER than Intel has.
Remember, we're talking about Intel, which resorted to underhanded practices for years to screw AMD over, and has been selling 'upgrades' with minuscule performance gains for the last, what, five years? Conservatively.

This stuff matters and when you put it all together...it easily sways those with open minds or a specific dog in the race (like security).

If I ran an Intel CPU, I'd be OK with the vulnerabilities because I build a PC for gaming only.
No social media accounts, no bank accounts, not even Google/Gmail or Amazon.
The most someone could glean would be my Steam and various other gaming service logins, each of which has two-factor auth (2FA) where available.
It's a deliberate decision, made for exactly this sort of issue - security - and I go to great lengths to adhere to it, use 2FA, etc.

What's bobbing around my mind right now is: on my work PC I do log into Amazon and Google/Gmail. Do I need to rethink that policy, since I have an affected i5-2320 in that client machine? Even with 2FA enabled for Google, I try to always be as secure as I can possibly be.


I don't love the socket changes either, but I don't see why AMD gets so much credit here.

It's easy to keep a socket around forever when in six years you only released Bulldozer and Piledriver. That's not really any better than Haswell to Devil's Canyon; both lasted one refresh. AMD just released a lot fewer CPUs...

AMD could have gone into those designs planning on exactly that; do we know any better?
Sounds like justification talk to me.
The fact is, Intel is not doing that now or with their current roadmap.
So kudos to AMD for either planning that from the get go, or taking the pulse of the market and listening to people complain about all the socket changes and adapting accordingly.
That's good business, assuming the performance of the product in question doesn't suffer.
That's the credit they should be given; whether they released fewer CPUs is irrelevant to this specific point.
 
Sandy Bridge was one of Intel's best generations. Nothing wrong or "limping" about it. In fact, Intel made some serious mistakes (that were good for the consumer) with that line.

You're right, but it's starting to show its age. There is a definite limp showing up.
 
I'll give credit to Microsoft for canning the CEO after the shitshow that was Win8/8.1, because almost zero enterprise customers adopted it due to the radical GUI changes. At least Microsoft listened and learned from that, and brought out a successor that was a breeze for users to transition to from Win7.
When is this version coming out! I want it!
 
I have faith that AMD would have handled the situation a LOT BETTER than Intel has.
Remember, we're talking about Intel, which resorted to underhanded practices for years to screw AMD over, and has been selling 'upgrades' with minuscule performance gains for the last, what, five years? Conservatively.
.....

So kudos to AMD for either planning that from the get go, or taking the pulse of the market and listening to people complain about all the socket changes and adapting accordingly.
......
That's the credit they should be given; whether they released fewer CPUs is irrelevant to this specific point.

It's all relevant to the point. AMD didn't do this because they love their customers or are in touch with the market. They did it because they did not have a competitive processor. If AMD could have released a faster CPU on a new socket three years ago, they would have.

It's hard to hate Intel for releasing minuscule upgrades when AMD was offering no upgrades. They literally had nothing better to offer even if they wanted to.

If you think AMD or Intel cares about anything other than profits, you are crazy. Unfortunately, this is how publicly owned companies work. Having faith that AMD would have handled things any better is about as logical as having faith in God.

You keep having faith, I'll keep buying the best equipment for my money, regardless of manufacturer.
 
You're right, but it's starting to show its age. There is a definite limp showing up.

True, but we're talking a 10-20% uptick at best between Sandy Bridge and Skylake (real world). Just saying. Now, architecturally, sure, we've got USB 3, faster RAM... etc.
 
So, Apple didn't tell iPhone users about the throttling, and they got sued for it because they took the choice away from iPhone users.
Reportedly, Google found the two major bugs and informed Intel on June 1, 2017.
Google’s engineering teams began working to protect our customers from these vulnerabilities upon our learning of them in June 2017. We applied solutions across the entire suite of Google products, and we collaborated with the industry at large to help protect users across the web.

https://www.blog.google/topics/google-cloud/answering-your-questions-about-meltdown-and-spectre/

That's Google. Like the OP said, it takes years to design and optimize a CPU before it goes into production. Intel knew about these bugs way before June 2017.
 
Maybe they plan on giving everybody who has an Intel CPU a HUGE discount on a new one? :)

 
I am seriously looking to swap my computer's guts out for an AMD rig, because I know I personally won't go Intel again regardless of a few percent speed difference in benchmarks.
Ran AMD back in the day with a Clawhammer x72, and I feel they've got it back with Zen... I just needed a nudge, and instead I got thrown out of the car...
Thanks, Intel ;/
 
I'm curious whether this means the proposed Z390 chips and a CFL (Coffee Lake) refresh, or something we've yet to hear about.
 
Honestly, I don't care much about my home PC with this (there is still no updated BIOS for my board). What I worry about are the servers at the office, especially the new batch of servers we just acquired last year, ffs.
 
Yeah, Q4 sales weren't really affected by Spectre and Meltdown. Q1 is gonna hurt them. I don't think 2018 is going to be as kind to Intel as 2017 was, slip slidin' away...
I doubt it will matter that much. Companies are going to move to Windows 10 over the next 2 years, and I seriously doubt they'll delay new H/W (which may have already been delayed because of their migration plans). But if I'm wrong, it still doesn't really matter. If Pfizer delays new H/W for 6 months, they're still buying Intel CPUs.

Now for peeps like you and me, I completely agree. I was thinking of getting the new Dell XPS 15 in a couple of months, but I'll probably wait until they come with the revised CPU.
 
It's all relevant to the point. AMD didn't do this because they love their customers or are in touch with the market.

Actually, that last part is arguably the exact reason AMD hasn't changed sockets frequently. AMD got a lot of flak for releasing AM2 right after Socket 939 and then prematurely killing off 939, even though they were the same platform other than supporting different memory. As a result of that backlash, they made a point of promising that AM3 would continue to support new CPUs for some time, and they have obviously continued that philosophy.

It's possible that this approach has held AMD back some, but Intel has gone the opposite route and has changed sockets when there was no justifiable reason to do so.
 
It's any information in memory; passwords/keys are just the most obvious targets.
Understood, but there are many types of flaws in CPUs; see any Intel errata summary table for examples. Those flaws don't allow for the reading of privileged memory.
 
So should I wait even longer and not build an 8700K system? Maybe the next gen will be 8 cores... Maybe by summer, just after the new 11-series Nvidia cards are available...
 