Linus: You Can't Trust Apple's Performance Claims

Sure, Apple's M1 and M2 are pretty cool, but where do Apple's claimed performance numbers come from anyway?

Linus tackles this:



Marketers will, uh, Marketeer I guess.

Not usually a fan of Linus, but I have to admit, this is a decent exposé.

He does give Apple high marks for power use though, which is well deserved.
 
Let's all laugh at an industry that never learns anything. Tee hee hee. This is what happened to Apple's G4s and G5s when they started to lose to Intel. Just wait for the Ryzen Phoenix CPUs to get benchmarked. I'm sure the Apple marketing FUD will really get nutty.
 
Comparing to a custom build would also be harsh on an HP Z or Lenovo workstation once you go anything higher than baseline RAM and hard drive; for $4,000 you do not get something better on:

https://www.lenovo.com/us/en/configurator/cto/index.html?bundleId=30E0CTO1WWUS2
or
https://www.dell.com/en-us/shop/wor...rationid=156e6772-7d48-4bed-b88a-e2448e8978a4

Going from 32 to 128 GB costs $1,400 to $1,800 here, depending on the wanted bandwidth.

That's more a custom vs. prebuilt affair, and I'm sure larger buyers get much better deals than what is available online (I suspect almost no one pays those RAM and drive prices instead of adding the parts themselves if needed, so the actual cost goes down when you talk to sales people if they are included). Like laptops, it does not feel like a particularly bad price in that market.
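For scale, the per-gigabyte math on that quoted upgrade range (simple arithmetic on the numbers above, nothing more):

# Per-GB cost of the quoted workstation RAM upgrade (32 GB -> 128 GB).
added_gb = 128 - 32  # 96 GB added
for price_usd in (1400, 1800):
    print(f"${price_usd} for {added_gb} GB -> ${price_usd / added_gb:.2f}/GB")

That works out to roughly $15 to $19 per GB, well above typical retail per-GB pricing, which is why almost nobody pays the configurator price when they can add modules themselves.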
 
Sadly, Apple's level of attorney power will keep anyone other than a state or country from suing.
 
Not sure why anyone would be surprised; this has ALWAYS been how Apple is. Even when they have a powerful product that does well, they still can't help themselves but overstate its performance. It just always seemed like one of those weird quirks of Steve Jobs that he couldn't just have something that was good; it had to be the best, of all time, ever. Maybe it wasn't him, though, since they are still doing it, or maybe it has just been institutionalized at this point.

Either way, never trust Apple's benchmarks. I mean, never trust ANY company's benchmarks, but particularly not Apple's. They always overstate shit and have for decades.
 
5x is bullshit. But out of the gate, PowerPC had basically everything over x86.
Not really. Out of the gate it was competitive, if you were running native code (which Apple wasn't), but it wasn't some amazingly faster tech. However, the Mac fanboys pointed to the supposedly coming clock speed increases as what would make it worth it. They claimed that PPC had a positive second derivative (growth of growth) of MHz while x86 had a negative second derivative. So while it wasn't completely trouncing Intel YET, it would soon, as code went native, PPC's MHz ran away, and Intel's stagnated.

Of course, then the opposite happened: x86 had good clock speed scaling and PPC stagnated, so suddenly they started talking about "the MHz myth" as though they hadn't started it, and about how clock speed doesn't matter (it does; it isn't the only thing that matters, but it matters) and so on.
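To spell out that "second derivative" framing in symbols (my notation, not anything from the era's actual marketing): with f(t) as a chip family's shipping clock speed over time, the pitch amounted to

f''_PPC(t) > 0  (MHz growth accelerating),    f''_x86(t) < 0  (MHz growth decelerating)

so the two curves would have to cross eventually, no matter who led at any given moment. Reality swapped the signs.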
 
Not sure why anyone would be surprised; this has ALWAYS been how Apple is. Even when they have a powerful product that does well, they still can't help themselves but overstate its performance. It just always seemed like one of those weird quirks of Steve Jobs that he couldn't just have something that was good; it had to be the best, of all time, ever. Maybe it wasn't him, though, since they are still doing it, or maybe it has just been institutionalized at this point.

Either way, never trust Apple's benchmarks. I mean, never trust ANY company's benchmarks, but particularly not Apple's. They always overstate shit and have for decades.
Apple's "reality distortion field," made famous by Jobs himself.
 
Not really. Out of the gate it was competitive, if you were running native code (which Apple wasn't), but it wasn't some amazingly faster tech. However, the Mac fanboys pointed to the supposedly coming clock speed increases as what would make it worth it. They claimed that PPC had a positive second derivative (growth of growth) of MHz while x86 had a negative second derivative. So while it wasn't completely trouncing Intel YET, it would soon, as code went native, PPC's MHz ran away, and Intel's stagnated.

Of course, then the opposite happened: x86 had good clock speed scaling and PPC stagnated, so suddenly they started talking about "the MHz myth" as though they hadn't started it, and about how clock speed doesn't matter (it does; it isn't the only thing that matters, but it matters) and so on.
Clock-for-clock, the PowerPC 601 at 66MHz was more powerful overall than the Intel Pentium at 66MHz.
I still have a system running a 100MHz PowerPC 601 CPU, and it is more in line with a Pentium 150MHz, give or take.

I agree with the second part. IBM did not want to continue to invest in the PowerPC 970MP, which featured in the Apple G5 Quad in 2005, and even Intel's 32-bit Core CPUs were much more powerful and power-efficient by 2006.
 
I'm not one for brand loyalty, but I'd never buy anything Apple even if it were the only game in town.

The question is not even whether they are screwing the customer, but how badly, and in how many ways.
Isn't "I will never buy X" just as bad as "I will only buy X?" I'm not a fan of dealing in absolutes, and it's a bit optimistic to think that your favourite Windows or Android hardware brands aren't screwing you over at times. I'm mainly an Apple user, but I know what Android phone and Windows PC(s) I'd buy if Apple got on my nerves.

I'm not apologizing for Apple's murky benchmarking, but Macs are still good computers in the right situations. A Mac Studio is an excellent audiovisual editing rig, especially if size, quiet and power efficiency play any part in your decision. Where it struggles is brute force; even though the M2 Ultra delivers a ton of power per watt, the fact is that a conventional desktop CPU with a monster GPU can throw enough watts and large chip dies at a problem to come out on top in at least some situations.
 
Isn't "I will never buy X" just as bad as "I will only buy X?" I'm not a fan of dealing in absolutes, and it's a bit optimistic to think that your favourite Windows or Android hardware brands aren't screwing you over at times. I'm mainly an Apple user, but I know what Android phone and Windows PC(s) I'd buy if Apple got on my nerves.

I'm not apologizing for Apple's murky benchmarking, but Macs are still good computers in the right situations. A Mac Studio is an excellent audiovisual editing rig, especially if size, quiet and power efficiency play any part in your decision. Where it struggles is brute force; even though the M2 Ultra delivers a ton of power per watt, the fact is that a conventional desktop CPU with a monster GPU can throw enough watts and large chip dies at a problem to come out on top in at least some situations.
Excluding one brand still gives you plenty of choice; restricting yourself to one doesn't give you any. I have no favorite Android manufacturer; in fact, I never owned the same brand of smartphone twice.

I'd be perfectly happy to reverse my decision if Apple suddenly becomes a consumer-centric and fair company. But until I see evidence of that happening, I'll keep away from their overpriced, fragile devices and awful after-sales treatment.
 
Isn't "I will never buy X" just as bad as "I will only buy X?" I'm not a fan of dealing in absolutes, and it's a bit optimistic to think that your favourite Windows or Android hardware brands aren't screwing you over at times. I'm mainly an Apple user, but I know what Android phone and Windows PC(s) I'd buy if Apple got on my nerves.

I'm not apologizing for Apple's murky benchmarking, but Macs are still good computers in the right situations. A Mac Studio is an excellent audiovisual editing rig, especially if size, quiet and power efficiency play any part in your decision. Where it struggles is brute force; even though the M2 Ultra delivers a ton of power per watt, the fact is that a conventional desktop CPU with a monster GPU can throw enough watts and large chip dies at a problem to come out on top in at least some situations.
In grammar school we had Macs. They were fine in a school environment. At one job I was issued an iPhone. It was fine for email and calls. I've used both over the years. I just don't like Apple's business practices or marketing. I don't like Nvidia's either, for that matter, but when I needed a graphics card I was able to get my grubby hands on one from Nvidia.

I don't need Apple products. Anything I need to do can be done on Windows (which I prefer) or Android. It would be different if I needed Apple products for some reason but I don't. So I can avoid them.
 
Excluding one brand still gives you plenty of choice; restricting yourself to one doesn't give you any. I have no favorite Android manufacturer; in fact, I never owned the same brand of smartphone twice.

I'd be perfectly happy to reverse my decision if Apple suddenly becomes a consumer-centric and fair company. But until I see evidence of that happening, I'll keep away from their overpriced, fragile devices and awful after-sales treatment.
It's still trading in absolutes... and if I'm honest, it sounds like you're putting some blinders on.

I don't think Apple is going to suddenly build fully modular ATX desktops. The switch to ARM makes that difficult, and it's not really Apple's MO. But they're not all overpriced, and I'm not sure I'd say fragile, either. Hell, the new MacBook Pros are relative tanks.

I can't comment on Apple's current after-sales support, but... I also haven't had to use it for several years.
 
Wait.

People are telling me that a marketing department isn't a reliable source of accurate technical information about a given product?

What is the world coming to.

New generation learning old things. Maybe it'll change things; we all just grew up knowing commercials lied to us and didn't expect anything else. Maybe the next group will push to make it not allowed to lie, which could be a good thing. (shrugs)

Where's that meme of the 486 desktop PC with the sticker "never obsolete" on it? lol
 
It's still trading in absolutes... and if I'm honest, it sounds like you're putting some blinders on.

I don't think Apple is going to suddenly build fully modular ATX desktops. The switch to ARM makes that difficult, and it's not really Apple's MO. But they're not all overpriced, and I'm not sure I'd say fragile, either. Hell, the new MacBook Pros are relative tanks.

I can't comment on Apple's current after-sales support, but... I also haven't had to use it for several years.
Don't move the goalposts; you said it is just as bad, which it clearly isn't, as I've described. Yes, I absolutely don't want to buy an Apple product now, but if they changed their practices and reputation I might in the future.

How are they not overpriced when their cheapest model costs more than 2x as much as a similar-sized but much better-specced notebook PC from other brands? And the models I'd even consider suitable for me start at 2x what I'd ever be willing to pay. Same for their phones.

As for being fragile, it doesn't literally mean the device shatters like glass; it means the hardware is not designed to be robust or easily repairable. In fact, it is the exact opposite of that.

I never had to use their customer service either, because I never even had an Apple product, but I can read all the horror stories online. Those are strangely always about them and rarely about the other widespread brands. They say the wise person learns from other people's mistakes, while the fool doesn't learn even from their own.
 
Isn't "I will never buy X" just as bad as "I will only buy X?" I'm not a fan of dealing in absolutes, and it's a bit optimistic to think that your favourite Windows or Android hardware brands aren't screwing you over at times. I'm mainly an Apple user, but I know what Android phone and Windows PC(s) I'd buy if Apple got on my nerves.
It's not as clear-cut as "I'll only buy X," because X has infinite choices. For example, as an Android user, if HTC were to piss me off by making me go to their website and fill out information to unlock a bootloader, just so I can install whatever OS I want on my phone, then maybe I won't buy HTC. Maybe I'll buy from Motorola or LG, or any number of Android devices being sold. Whereas with Apple, if I only buy Apple and let's say Apple won't let me repair my iPhone, then exactly who can I go to as an alternative for iOS?
I'm not apologizing for Apple's murky benchmarking, but Macs are still good computers in the right situations.
Broken clocks can be right twice a day, but that doesn't mean they aren't fundamentally broken.
A Mac Studio is an excellent audiovisual editing rig, especially if size, quiet and power efficiency play any part in your decision. Where it struggles is brute force; even though the M2 Ultra delivers a ton of power per watt, the fact is that a conventional desktop CPU with a monster GPU can throw enough watts and large chip dies at a problem to come out on top in at least some situations.
The problem with this is assuming you build a monster machine like Linus Tech Tips did. The i9-13900K is infamous for drawing a lot of power, which can be 413 watts. The Ryzen 9 7950X will use 250 watts under load, which is nearly 200 watts less than the 13900K. While a Radeon RX 7900 XTX will use 360 watts while gaming, the RTX 4090 can use as much as nearly 470 watts when properly loaded, though with gaming it's around 350 watts. So right from the start, Linus had set up the PC to fail in terms of power consumption, when a comparable AMD build could perform nearly as well while using anywhere from 30% to 40% less power. A 13900K with an RTX 4090 might use almost as much as 800 watts under max load, while a 7950X with a 7900 XTX might use as much as 600 watts. That's not even counting AMD's driver updates, which have since reduced power usage.

Another thing to keep in mind is that this is all in theory, because the power tests were done in extreme cases. The CPU power tests I linked were using Cinebench, and Linus's own graph for the 13900K was using Prime95. Both the 4090 and 7900 XTX were able to draw as much power as they did using Furmark. Linus's quick power test for the 4090 was done with MSI Kombustor, which is just Furmark. Meanwhile, the power test on Apple's Mac Studio M2 Ultra measured 331 watts using Blender.

When looking at power consumption on these systems using Blender like Linus did, it looks less impressive for Apple. The Ryzen 9 7950X's total system power usage is around 330 watts, just like Apple's when using Blender. The 13900K will use nearly 500 watts. Here are PCWorld's results.


Hard to tell if they were using both the CPU and the GPU for these tests, since they were mostly focused on the CPU, but this is total system power draw while using Blender. The assumption that a system will use as much power as it can for a single application is not correct, and Linus, I'm sure, is aware of it, but he certainly did paint that scenario. Especially when he didn't even bother to run a total system power draw test on the 13900K with the RTX 4090, but he did set it up to make you assume that it can draw up to 800 watts. Which can happen if you run Prime95 and Furmark at the same time, but anyone who knows anything about these tools knows they aren't realistic in terms of usage. So again, 331 watts from the Apple Mac Studio is not impressive.
[Chart: PCWorld total system power draw running Blender]
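To make the worst-case arithmetic above concrete, here's a quick back-of-the-envelope sketch using only the wattage figures quoted in this post (stress-test numbers from those specific reviews, so treat them as assumptions rather than typical loads):

# Back-of-the-envelope total-system-power comparison using the stress-test
# figures quoted above; real workloads rarely max CPU and GPU at once,
# which is exactly the point being made about Linus's framing.
intel_build = {"cpu_w": 413, "gpu_w": 470}  # i9-13900K (Prime95) + RTX 4090 (Furmark)
amd_build = {"cpu_w": 250, "gpu_w": 360}    # Ryzen 9 7950X + RX 7900 XTX
mac_studio_blender_w = 331                  # M2 Ultra, whole system, Blender

def worst_case_watts(build: dict) -> int:
    """Naive sum of CPU + GPU stress-test draw."""
    return build["cpu_w"] + build["gpu_w"]

intel_total = worst_case_watts(intel_build)  # 883 W theoretical worst case
amd_total = worst_case_watts(amd_build)      # 610 W theoretical worst case
savings = 1 - amd_total / intel_total
print(f"Intel worst case: {intel_total} W, AMD worst case: {amd_total} W")
print(f"AMD build draws about {savings:.0%} less in this crude model")

That lands at roughly 31% less for the AMD build in this crude model, consistent with the 30% to 40% range above, and it's the same arithmetic that makes 331 W for the M2 Ultra under Blender look ordinary next to a ~330 W 7950X system.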
 
when a comparable AMD build could perform nearly as well while using anywhere from 30% to 40% less power
Not for all the benchmarks they did:

[Chart: AMD Radeon RX 7900 XTX 24GB Blender score]


Does Octane or Redshift even work on an AMD GPU?

It just happened for Redshift, apparently:
https://www.pugetsystems.com/labs/articles/redshift-adds-amd-gpu-support/
but the gap is massive:
[Chart: Redshift render times with RTX on (Puget Systems)]


188s is still faster than the M2 Ultra (287s), but more in the same ballpark, with some exceptions (Resolve). And efficiency-wise it isn't really an issue anyway: Lovelace competes really well with, if not beats, AMD this round, while having a slightly better TSMC node overall, and a significantly better one than the 6nm process used for parts of the big RDNA 3 card.
 
It's still trading in absolutes... and if I'm honest, it sounds like you're putting some blinders on.

I don't think Apple is going to suddenly build fully modular ATX desktops. The switch to ARM makes that difficult, and it's not really Apple's MO. But they're not all overpriced, and I'm not sure I'd say fragile, either. Hell, the new MacBook Pros are relative tanks.

I can't comment on Apple's current after-sales support, but... I also haven't had to use it for several years.

They could start by not soldering the RAM to the board, and not using proprietary NVMe drives that aren't user-upgradeable.

I'm sure they could change their ways, but as long as their products are not user-maintainable and upgradeable, they are not for me under any circumstances.
 
They could start by not soldering the RAM to the board, and not using proprietary NVMe drives that aren't user-upgradeable.
The first is sadly necessary due to the short traces needed for DDR5 to function the way it does at those speeds; the second I agree with, and it is BS.
 
They could start by not soldering the RAM to the board, and not using proprietary NVMe drives that aren't user-upgradeable.

I'm sure they could change their ways, but as long as their products are not user-maintainable and upgradeable, they are not for me under any circumstances.
Every OEM does this though, with the claim of "it adds to the reliability and stability of the system and lets us make things even thinner!" versus the truth as we know it: "I will buy your min-spec device and upgrade the RAM myself to 32GB for 1/10th of the price they want to charge us!"
 
They could start by not soldering the RAM to the board, and not using proprietary NVMe drives that aren't user-upgradeable.
As noted by Red Falcon, the RAM is part of the CPU package, which AMD, for example, has also started to do at the top of their product stack:
https://hardforum.com/threads/amd-outs-genoa.2029204/
Doing it this way has many performance advantages. Most notably, in the case of the Mac, the CPU and GPU share the same RAM pool, meaning there doesn't have to be duplication in memory, and there is direct access to memory from both the CPU and the GPU. And it has the notable disadvantage of not being repairable.

In the case of AMD's Genoa there is the option to add RAM, though it has yet to be seen how problematic that will be with a lot of its intended workflows, and hence it has a die-only RAM mode, a system-only RAM mode, and a combined mode.

Yeah, the non-upgradeable NVMe also sucks. But FWIW, the new M2 Studio (not the M1 Studio) and the Mac Pro both have user-serviceable drives, though they're proprietary, so at this time only Apple parts can be used to replace/upgrade them. Is that useful? Likely not at all, for anything other than failure anyway.

Part of the reason why it's proprietary is that the SSD controller is also part of the CPU system package, as it is responsible for all the on-the-fly encryption and compression/decompression. Whether it should be or shouldn't be is up for debate. But certainly, theoretically, other flash manufacturers could make the same proprietary NAND sticks for less money, if the Studio, as an example, allowed for non-Apple-signed sticks.
I'm sure they could change their ways, but as long as their products are not user-maintainable and upgradeable, they are not for me under any circumstances.
We've talked about this at other times. And while we 'disagree' I think we definitely agree that Apple isn't and likely will never be for you. And that's fine.
 
Every OEM does this though, with the claim of "it adds to the reliability and stability of the system and lets us make things even thinner!" versus the truth as we know it: "I will buy your min-spec device and upgrade the RAM myself to 32GB for 1/10th of the price they want to charge us!"
DDR5 has changed the game due to architectural changes between it and DDR4.
The excuse of thin, thin, thin is no longer a thing; embedded DDR5 SDRAM chips are actually necessary at this point due to signaling and latency, especially at higher speeds and transfer rates.

The embedded NVMe is nonsense, though, and should absolutely be able to be upgraded.
 
DDR5 has changed the game due to architectural changes between it and DDR4.
The excuse of thin, thin, thin is no longer a thing; embedded DDR5 SDRAM chips are actually necessary at this point due to signaling and latency, especially at higher speeds and transfer rates.

...then how come I can still stick DDR5 DIMMs in a desktop?

Why would the signalling be different on a laptop or compact AIO desktop? DDR5 signalling is DDR5 signalling, right?

Is it just that the smaller SODIMM form factor doesn't work?
 
...then how come I can still stick DDR5 DIMMs in a desktop?
They aren't operating anywhere near the speeds that are in the Mac Studio.
DDR5 changed a lot in the architecture of SDRAM; it isn't just a speed boost with lower voltages like each incremental generation before it.

Why would the signalling be different on a laptop or compact AIO desktop? DDR5 signalling is DDR5 signalling, right?
Because of how DDR5 now functions.
It's the same for LPDDR4X, which is why you always see it embedded and not with discrete modules.

Is it just that the smaller SODIMM form factor doesn't work?
Not at those speeds, correct.
 
How are they not overpriced when their cheapest model costs more than 2x as much as a similar-sized but much better-specced notebook PC from other brands? And the models I'd even consider suitable for me start at 2x what I'd ever be willing to pay. Same for their phones.
But that's not really true. Apple is stingy with RAM and storage, but twice as much? Not really. Take the Dell XPS 13 for example: right now, the base model with a 12th-gen Core i7 (13th-gen is still considered a step up), 8GB of RAM, a 1200p non-touch display and 512GB SSD normally goes for $999. That's less than the current gen MacBook Air with 256GB, but the Air will be faster in some tasks, last longer on battery, boast a higher-resolution display and operate completely silently.
Apple's main weak point right now is desktops. I'd say the Mac mini is a good general productivity machine, but the iMac hasn't been updated in two years. While I'd say the Mac Studio is a fast system, that's only true up to a point; like we've been discussing in this thread, you can do better with Windows PCs if you're willing to throw lots of energy at a task. And I really have to ask why Apple is charging such a stiff premium for the Mac Pro when modularity is the only real advantage over the Studio. If that rumored M2 Extreme had made the cut, maybe...

Apple's phones are priced roughly on par with similar flagships. The catch is just that you can get "good enough" performance from Android phones costing considerably less; a Pixel 7 or Nothing Phone 2 is plenty if you aren't a stickler for camera options and raw speed. The one exception in Apple's lineup is the iPhone SE... yeah, that should've been upgraded to a contemporary design with at least last year's revision, if not 2020.


As for being fragile, it doesn't literally mean the device shatters like glass; it means the hardware is not designed to be robust or easily repairable. In fact, it is the exact opposite of that.
Some of it is fairly robust. Not rugged, clearly, but designed to hold up in the real world. And while I'm not a huge fan of the non-upgradeable SSDs, there are signs of change. Apple redesigned the iPhone 14 internals to make them more repairable than before. I understand the MacBook Air M2 also has a few concessions to repairability. This is in no small part due to growing right to repair legislation efforts, but it is happening.

I never had to use their customer service either, because I never even had an Apple product, but I can read all the horror stories online. Those are strangely always about them and rarely about the other widespread brands. They say the wise person learns from other people's mistakes, while the fool doesn't learn even from their own.
I'll be frank: you're basing this on selective anecdotes, not thorough evidence. There are plenty of horror stories from other vendors, like the friend who went through multiple Dell Inspiron G15 laptops to get a fixed unit, or terrible batteries on HP laptops, or... these stories just don't garner as much attention, whether it's because they're just not as exciting or they're more expected. Everyone knows that a $500 laptop will involve cutting corners; it's more surprising if a $1,500 laptop flakes out.

Apple still tops the computer chart on the American Customer Satisfaction Index. The gap has closed considerably, thankfully, but the notion that it's somehow worse than most isn't supported by real data. Personally, I've rarely had issues with Apple gear — one MacBook that shipped with bad RAM, one iMac whose PSU wore out after several years... and that's about it. The support was pretty solid, although I won't pretend Apple is immune to the occasional crappy tech or botched repair.
 
New generation learning old things. Maybe it'll change things; we all just grew up knowing commercials lied to us and didn't expect anything else. Maybe the next group will push to make it not allowed to lie, which could be a good thing. (shrugs)

Where's that meme of the 486 desktop PC with the sticker "never obsolete" on it? lol
Imagine how boring politics would be if nobody lied.
 
One could point out that the issue here is more the cherry-picked reviewer than Apple (you may as well use a cherry-picked best-case scenario if you can get away with it; it was my first reflex).

But after thinking a little bit about it: how many reviews on the x86 side don't include Mac benchmarks when they would be relevant? And my brain never complains about it.
 
...then how come I can still stick DDR5 DIMMs in a desktop?

Why would the signalling be different on a laptop or compact AIO desktop? DDR5 signalling is DDR5 signalling, right?

Is it just that the smaller SODIMM form factor doesn't work?
DDR, be it 4 or 5, drops significantly in speed when using more than 2 modules. The longer the trace, the higher the latency, and even in a 4-module configuration the trace distance between the first and last module is long enough that the speed of the entire bank needs to be decreased to compensate.
This shows in all current AMD and Intel systems: with 2 modules you can push to DDR5-8000, but put in 4 modules to bring that to 128GB of RAM and you have to drop down to DDR5-3200 to deal with the latency the tracing introduces. Level1Techs did a pretty good breakdown of the problems in a video, if you want to dig that up.
This is where the LPDDR5 stuff comes in. It is faster, better, stronger, and blah blah blah, but there is no DIMM or SODIMM standard for it and nobody makes one, so if Apple did, it would be proprietary in and of itself.
The closest thing to a standard for DDR or LPDDR that would support the speeds, latency, and quantity is the new CAMM format, but that isn't finalized yet, and when it is, we can only hope that Apple supports it. But let's be real, they won't, because the market hasn't forced them to.
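For a rough sense of the timing budget involved, here is a toy calculation (the constants are ballpark assumptions: roughly 6.6 ps of propagation delay per millimetre of FR4 trace; this is an illustration of the trace-length problem, not a real signal-integrity analysis):

# Toy timing math: why a couple of centimetres of extra trace eats a
# large chunk of a DDR5 bit time. 6.6 ps/mm is a ballpark figure for
# signal propagation in FR4 PCB material (about half the speed of light).
PROP_DELAY_PS_PER_MM = 6.6

def unit_interval_ps(mega_transfers: int) -> float:
    """Duration of one bit on the bus, in picoseconds."""
    return 1e6 / mega_transfers

for speed in (3200, 6400, 8000):
    ui = unit_interval_ps(speed)
    skew = 20 * PROP_DELAY_PS_PER_MM  # extra delay from 20 mm more trace
    print(f"DDR5-{speed}: bit time {ui:.0f} ps; 20 mm of extra trace "
          f"adds {skew:.0f} ps ({skew / ui:.0%} of the bit time)")

At DDR5-8000, a mere 20 mm of extra trace costs more than a full bit time in this toy model, which is why far DIMM slots force the whole bank to clock down and why soldered or on-package memory can run so much faster.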
 
They could start by not soldering the RAM to the board, and not using proprietary NVMe drives that aren't user-upgradeable.

I'm sure they could change their ways, but as long as their products are not user-maintainable and upgradeable, they are not for me under any circumstances.
To elaborate on what others have said: the RAM is built into the SoC package itself, and that's key to some of Apple's performance claims. The RAM is pooled between the graphics and CPU with little penalty, which allows for some unique things, like making up to 192GB of system RAM available for graphics. Removable RAM might negate that advantage.

The NVMe drive restriction is frustrating, to be clear, at least if you're the sort to upgrade storage. I just don't consider it a dealbreaker in my case.
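As a toy illustration of that duplication point (a conceptual sketch of the argument only; this is not how macOS or any driver actually manages memory, and the numbers are made up):

# Conceptual sketch: with a discrete GPU, a buffer the GPU works on
# typically exists in VRAM *and* in system RAM (at least while staging);
# with unified memory there is one copy that CPU and GPU both address.
def discrete_footprint_gb(buffer_gb: float) -> float:
    return buffer_gb * 2  # one copy in system RAM + one copy in VRAM

def unified_footprint_gb(buffer_gb: float) -> float:
    return buffer_gb      # single shared allocation, zero-copy

scene_gb = 48.0  # hypothetical large 3D scene or video timeline
print(f"Discrete: ~{discrete_footprint_gb(scene_gb):.0f} GB touched, and the "
      f"working set must also fit in VRAM (a 24 GB card can't hold it)")
print(f"Unified:  ~{unified_footprint_gb(scene_gb):.0f} GB, fine as long as it "
      f"fits the shared pool (up to 192 GB on an M2 Ultra)")

The flip side, as noted above, is that the pool is fixed at purchase and not repairable.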
 
But that's not really true. Apple is stingy with RAM and storage, but twice as much? Not really. Take the Dell XPS 13 for example: right now, the base model with a 12th-gen Core i7 (13th-gen is still considered a step up), 8GB of RAM, a 1200p non-touch display and 512GB SSD normally goes for $999. That's less than the current gen MacBook Air with 256GB, but the Air will be faster in some tasks, last longer on battery, boast a higher-resolution display and operate completely silently.
Apple's main weak point right now is desktops. I'd say the Mac mini is a good general productivity machine, but the iMac hasn't been updated in two years. While I'd say the Mac Studio is a fast system, that's only true up to a point; like we've been discussing in this thread, you can do better with Windows PCs if you're willing to throw lots of energy at a task. And I really have to ask why Apple is charging such a stiff premium for the Mac Pro when modularity is the only real advantage over the Studio. If that rumored M2 Extreme had made the cut, maybe...

Apple's phones are priced roughly on par with similar flagships. The catch is just that you can get "good enough" performance from Android phones costing considerably less; a Pixel 7 or Nothing Phone 2 is plenty if you aren't a stickler for camera options and raw speed. The one exception in Apple's lineup is the iPhone SE... yeah, that should've been upgraded to a contemporary design with at least last year's revision, if not 2020.



Some of it is fairly robust. Not rugged, clearly, but designed to hold up in the real world. And while I'm not a huge fan of the non-upgradeable SSDs, there are signs of change. Apple redesigned the iPhone 14 internals to make them more repairable than before. I understand the MacBook Air M2 also has a few concessions to repairability. This is in no small part due to growing right to repair legislation efforts, but it is happening.


I'll be frank: you're basing this on selective anecdotes, not thorough evidence. There are plenty of horror stories from other vendors, like the friend who went through multiple Dell Inspiron G15 laptops to get a fixed unit, or terrible batteries on HP laptops, or... these stories just don't garner as much attention, whether it's because they're just not as exciting or they're more expected. Everyone knows that a $500 laptop will involve cutting corners; it's more surprising if a $1,500 laptop flakes out.

Apple still tops the computer chart on the American Customer Satisfaction Index. The gap has closed considerably, thankfully, but the notion that it's somehow worse than most isn't supported by real data. Personally, I've rarely had issues with Apple gear — one MacBook that shipped with bad RAM, one iMac whose PSU wore out after several years... and that's about it. The support was pretty solid, although I won't pretend Apple is immune to the occasional crappy tech or botched repair.
Apple might not have replaceable storage, but they at least nag the everloving crap out of you if you don't have a backup in place, be it Time Machine or iCloud (they prefer both, really). So while that still results in a huge-ass repair/replacement bill should the storage fail before you replace the unit, they at least have a series of solid backup options in place for users.
I do always tell people, though, to get AppleCare and keep it current for the duration of their ownership, because if you ever need to use it, it will have more than paid for itself.
 
Apple might not have replaceable storage, but they at least nag the everloving crap out of you if you don't have a backup in place, be it Time Machine or iCloud,

Because they want to sell you their overpriced backup unit.
 
Sure, Apple's M1 and M2 are pretty cool, but where do Apple's claimed performance numbers come from anyway?

Linus tackles this:



Marketers will, uh, Marketeer I guess.

Not usually a fan of Linus, but I have to admit, this is a decent exposé.

He does give Apple high marks for power use though, which is well deserved.

Apple generally needs to come up with a higher-TDP part for their desktops that is more than just two mobile Max parts slapped together. The M1 Max and now M2 Max are great in laptops, but once you're in the desktop realm it needs to be something bigger/better, although I'm sure a lot of this is down to current fabrication limits. The Mac Pro itself should have something that steps beyond the current M2 Ultra, even if it's as silly as slapping four together. Or, at the bare minimum, for the Mac Pro they need to change the architecture to allow external GPUs.
 
Apple generally needs to come up with a higher-TDP part for their desktops that is more than just two mobile Max parts slapped together. The M1 Max and now M2 Max are great in laptops, but once you're in the desktop realm it needs to be something bigger/better, although I'm sure a lot of this is down to current fabrication limits. The Mac Pro itself should have something that steps beyond the current M2 Ultra, even if it's as silly as slapping four together. Or, at the bare minimum, for the Mac Pro they need to change the architecture to allow external GPUs.
External GPUs and their addressing are one of the major failing points of the ARM ISA, and it is one of the big features of the Nvidia ARM products; Nvidia has put a lot of work into making that function.
 