Intel Pressuring Board Partners to Remove Features from Alder Lake via BIOS Update for "Product Segmentation"

Just corporations doing what they do. Don't like it?
Want change?
Stop supporting the board members' wallets.
 
Nobody's cracking down on overclocking; they are running the silicon so close to the edge that the turbo frequencies basically do the job for you, unless you start getting into the more exotic cooling solutions (and yes, an AIO or a massive Noctua is an exotic cooler). What there is now is competition, and neither party can afford to leave anything behind, because that extra 100MHz could be the difference between a part reviewing well or bombing.

They're actually running closer to the edge at stock than I normally do when I overclock something.
 
If making the exact same hardware (and unlocking physical features via a license) makes the supply chain cheaper for everyone, I am not de facto against it at all (a bit like Tesla batteries being locked down in software).

If the lock is pushed post-sale, that would be a different matter.
 
I haven't bought anything Intel in over ten years. Thank goodness that AMD is at least competitive with both CPUs and GPUs.
 
Well, there was that whole pre-Ryzen period of sub-par CPUs.
In the mid-range they did just fine, albeit hot. Frankly, I'm still using one, and it's chugging along better than its period competition, due to having moar cores. No noticeable difference in real-world applications between it and first-gen Ryzen.
 
Just corporations doing what they do. Don't like it?
Want change?
Stop supporting the board members' wallets.
The biggest problem with this is, I want their shit. I just don't like their methods. Yes, I can complain about their shady shit and still buy whatever I want. At the end of the day, I don't own a mega corp, so I can't exactly relate directly to their situation.
 
Well, there was that whole pre-Ryzen period of sub-par CPUs.
Ah, you mean the period when Intel's CPUs just didn't bother with any working security. It's interesting if you bench those supposedly terrible AMD chips today against Intel, with the security issues fixed on both... Guess what: AMD holds its own. They might have even won in some of the value segments, as the mitigations for most of those chips basically turn off Intel's broken branch predictor, which on the mid-range chips with low cache damn near halves their performance; in some cases it's even worse than that. Yeah, yeah, Intel beat AMD up for a stretch of 4 or 5 years by a good 30%... performance that mostly came from some Intel engineer deciding that their chips didn't need to check permissions before doing speculative cache/memory reads. (Which I still can't believe could have been done unknowingly... some engineer made that choice, and either his bosses were too dumb to understand or chose to ignore it for the performance improvement.)

Intel should pray to the chip gods with thanks nightly that no one discovered those flaws at the time. (That we know of... I mean, who knows, they might have been part of state-level tools for years.)
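
If you do rerun those benchmarks, it's worth confirming which mitigations were actually active during the run; on Linux the kernel reports per-vulnerability status under /sys/devices/system/cpu/vulnerabilities. A minimal C sketch (assuming a Linux system new enough to expose that directory) that just dumps those files:

```c
#include <dirent.h>
#include <stdio.h>

/* Print the kernel's per-vulnerability mitigation status so you know
 * what a benchmark run actually measured. Assumes Linux exposes
 * /sys/devices/system/cpu/vulnerabilities (kernel 4.15+). */
int main(void) {
    const char *dir = "/sys/devices/system/cpu/vulnerabilities";
    DIR *d = opendir(dir);
    if (!d) { perror(dir); return 1; }

    struct dirent *e;
    while ((e = readdir(d)) != NULL) {
        if (e->d_name[0] == '.')
            continue;                      /* skip "." and ".." */
        char path[512], line[256];
        snprintf(path, sizeof path, "%s/%s", dir, e->d_name);
        FILE *f = fopen(path, "r");
        if (f && fgets(line, sizeof line, f))
            printf("%-24s %s", e->d_name, line);
        if (f)
            fclose(f);
    }
    closedir(d);
    return 0;
}
```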
 
Ah, you mean the period when Intel's CPUs just didn't bother with any working security. It's interesting if you bench those supposedly terrible AMD chips today against Intel, with the security issues fixed on both... Guess what: AMD holds its own. They might have even won in some of the value segments, as basically turning off Intel's broken branch predictor on the mid-range chips would have damn near halved their performance.

Intel should pray to the chip gods with thanks nightly that no one discovered those flaws at the time. (That we know of... I mean, who knows, they might have been part of state-level tools for years.)
C'mon, that's a useless comparison. When they were just released is when we were comparing what to buy, not something 7 years old at this point, since the vast majority of us have moved on since then.
 
This might be dangerously close to off topic, but ever since the Celeron 300A, Intel has been cracking down on overclocking, tweaking, or otherwise stealing performance from them. The only reason we have overclocking and unlocked CPUs is because the competition offered them. Intel is a shady multinational corp that would sell your children into slavery in order to sell you a can of beans.
Incorrect. Intel has tried to prevent overclocking as far back as the Pentium days. There is a Pentium 133MHz stepping that won't run if not set correctly. However, despite Intel's efforts, everything from the 300A on up has been overclockable. The limitations we've generally seen come from certain architectures being less capable of clocking higher. This includes the latest silicon, which is already binned at the edge of what it's capable of. Meaning, there is no room to overclock. About all we've been able to do for some time now is lock the all-core frequency to the boost frequency, or close to it. Even then, power consumption skyrockets and you need heavy-duty cooling to handle the consequences.
 
C'mon, that's a useless comparison. When they were just released is when we were comparing what to buy, not something 7 years old at this point, since the vast majority of us have moved on since then.
Is it a useless comparison?

Point stands. Intel will cheat and steal to win. They will steal from OEMs... they will force them into buying all or nothing, or "buy X or you don't get Y", or "take our terrible power-sucking laptop chip no one wants or we'll downgrade your access".

To make things worse... yeah, the Meltdown/Spectre stuff proved they are also willing to cheat and steal from customers. That wasn't a mistake, was my point. Intel knowingly built a broken prediction engine. Either they built it because they knew that skipping the basic security checks that IBM, AMD, ARM, and every other chip with a prediction engine has always done would give them a 30-40% performance boost, or they built it at the behest of someone's government (OK, not someone's: ours)... again, so their clients could be compromised. Most security flaws you can look at and easily say, OK, someone missed that... Intel's prediction implementation was impossible to miss. No one who built it could possibly be that inept; university students with a C average wouldn't make that mistake. It was also a pretty key part of the design. We aren't talking about one set of eyeballs; that flaw had to have been seen by pretty much everyone on that Intel engineering team.
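
For anyone who hasn't seen it, the shape of the flaw being described is roughly the classic Spectre v1 bounds-check-bypass gadget. A minimal sketch in C (illustrative only, not a working exploit; the array names are hypothetical):

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical arrays, purely for illustration. */
uint8_t array1[16];
size_t  array1_size = 16;
uint8_t array2[256 * 512];

/* Classic Spectre v1 pattern: if the branch predictor speculates past
 * the bounds check, an out-of-bounds byte read via array1[x] is used
 * to index array2. The speculative work is rolled back architecturally,
 * but it leaves a cache footprint an attacker can recover by timing
 * which cache line of array2 loads fast afterwards. */
uint8_t victim_function(size_t x) {
    if (x < array1_size)                  /* check can be bypassed speculatively */
        return array2[array1[x] * 512];
    return 0;
}
```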

So yeah, those chips you're talking about, where AMD got the boots laid to them... it's hard in hindsight not to see that Intel was cheating. It's sort of like saying Lance Armstrong was the greatest cyclist and all his wins prove the competition just sucked... except that, oh wait, he was a cheat.
 
Despite this, if you were using AVX-512 as a consumer... it sounds like you can just stick with an older BIOS to retain the functionality.

Still, applications that leveraged it remain exceedingly rare, at least among those that are relevant to most consumers. This Anand thread was one of the only resources I could find that gave examples, and it was basically just x264.
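
If you want to confirm whether your particular chip (with whatever BIOS/microcode you've kept) still exposes it, a quick check with the GCC/Clang builtin is enough; a minimal sketch:

```c
#include <stdio.h>

int main(void) {
    /* GCC/Clang builtin; queries CPUID feature flags at runtime.
     * Reports what the CPU currently exposes, so a BIOS/microcode
     * update that fuses off AVX-512 will show up here as missing. */
    if (__builtin_cpu_supports("avx512f"))
        printf("AVX-512 Foundation: supported\n");
    else
        printf("AVX-512 Foundation: not exposed on this CPU\n");
    return 0;
}
```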

There is a bigger question, IMO, about what Intel is doing in the HEDT segment. Most of the innovation they've brought to market has primarily benefited consumer CPUs like the 12th-gen Core series, and I can't remember the last time mainstream manufacturers released compelling mobos for the HEDT platform anyway. A couple of years ago I switched from X99 to Z270 and, imho, there's no looking back.
Lots of compelling parts for HEDT out there - it's just getting old on the Intel side (X299 has been around what, 4 years now?). I still run them because of PCIe lanes (and occasionally memory bandwidth matters for my stuff), but... yeah. Hoping HEDT isn't dead.
I expect it is going to take Intel a couple of generations to get where they would like with GPUs. They are going to get to parity, though. With Intel's past aborted attempts, I believe there was a core of higher-ups at Intel who honestly didn't see the GPU future. I think they honestly believed GPUs were never going to be more than a video game market thing, which wasn't the big $$$, and that at some point the CPUs would start doing real-time ray tracing or something anyway. This time they clearly understand they need a GPU to provide 100% Intel solutions for supercomputer/server/AI/auto, etc., so they will ride out the embarrassment of a couple of also-ran product cycles. (Old Intel was too quick to abandon markets they didn't crush, too thin-skinned to admit they were #2... frankly, they still sort of are; their current CEO doesn't seem any different to me.)
I've worked with/under Pat Gelsinger in the past. I have two rules in my career: never bet against Michael Dell, and NEVER bet against Gelsinger. Both are ambitious, driven, creative, and arguably brilliant in very specific fields. If Pat believes that GPUs are a market need for Intel, they'll get whatever funding and support they need (and a headsman's axe to those not on board or pulling their weight); if he believes that it doesn't support their core mission, that whole team will be cut loose without even a fare-thee-well. I suspect they see a need to break out of the core x86 business, hence the investment in fab lines (and licensing if needed) and the GPU market push. He's cutting things that don't make sense (Optane for consumer markets) and driving things that do (enterprise, GPU (which leads to specialized processors), fabrication). All things that are Intel's core competency.
For this first round of GPUs, I don't think Intel will have the wiggle room to really pull the old one-wafer trick... 5 SKUs ranging from -50% margin to 250%. The GPUs are outsourced... I mean, they will obviously not price the bottom of the wafer at the top; I just don't think they can get away with the shenanigans of the past as easily, having to account more closely for wafer supplier payments. At this point, shareholders are going to want to know yields... and aren't going to be OK with "lasered-off" (or firmware-disabled) chunks of silicon being sold at cost, or worse, under cost. When Intel moves the GPU silicon to their own fabs, though... that might be very different. If Intel can ever sort their fabs out, then I expect a lot of OEMs to be stuck with the old "you buy our CPU, you buy our GPU for $1" or "perhaps we don't have enough CPUs to fill your order" BS. At that point Intel covers those OEM deals up in NDAs, reports the sales of CPU and GPU as "computing hardware unit" sales on the quarterly reports, and everyone is happy. (Same way companies like Microsoft have hidden Xbox losses for over a decade.)
We'll definitely see. Not as worried about margins on individual products (especially early on), but definitely curious about the bundling/etc.
Nobody's cracking down on overclocking; they are running the silicon so close to the edge that the turbo frequencies basically do the job for you, unless you start getting into the more exotic cooling solutions (and yes, an AIO or a massive Noctua is an exotic cooler). What there is now is competition, and neither party can afford to leave anything behind, because that extra 100MHz could be the difference between a part reviewing well or bombing.
This. I haven't touched it on my 10700K because the board exceeded my planned OC on its own.
 
The limitations we've generally seen come from certain architectures being less capable of clocking higher.
Oh, so Intel making multipliers unavailable in the BIOS when they previously were (and the competition allowed it) isn't a limitation then...
 
Oh, so Intel making multipliers unavailable in the BIOS when they previously were (and the competition allowed it) isn't a limitation then...
They have for a very long time now with the "K" CPU lines. However, there's little to gain on AMD or Intel, really, from manual overclocks anyway.
 
They have for a very long time now with the "K" CPU lines. However, there's little to gain on AMD or Intel, really, from manual overclocks anyway.
Those only exist because they were forced. And you pay more for the privilege.
 