PlayStation 5 GDDR6 memory reaches potentially critical temperatures due to seemingly inadequate PS5 cooling solution design

I don't know who else watched the whole GN video, but they basically used a cheap steel plate (not even aluminum) with minimal contact to the RAM chips, some of which are nowhere near the cooling plate. They should have added another heatpipe for the RAM, or made the one cooling the VRMs larger, or something. Really just seems like a lazy and cheap design.
Dang, that's really lousy.
Between that and the software issues they have been having, the PS5 definitely has some kinks that Sony needs to iron out.

Nice, a dedicated FUD console thread for DukenukemX to shit on consoles some more. Thanks, Erek. 😐
Ehh, I don't always agree with DukenukemX, but he does have a right to voice his opinion just as much as anyone else on here, and he isn't always wrong.
Also, Erek does an exceptional job of finding fun and entertaining tech articles, and this one does present an interesting and legitimate issue with the PS5 that could lead to potential failures, or at least increased wear and reduced longevity.

Looks like the trusty Jaguar CPUs in my thin client and PS4 Pro are going to be in service for a little while longer. (y)
 
Fine is subjective. It'll work fine for now, but for how long? Components running that hot will inevitably fail, just like with the Xbox 360 RROD.

Except the RROD was not caused by heat; it was a byproduct of the rapid industry switch to lead-free solders that had no ductility and mechanically failed. Heat definitely exacerbated the issue, but it was not the primary cause of the failure. Incompetent and uneducated EU lawmakers are to blame: they wrote the law based on "heavy metals are bad, we should ban them" rather than "why is lead used over other metals, and can it be effectively replaced with a safe alternative?" The answer to the latter question is "no," as we found out from the extremely high failure rates across all ranges of electronic products from 2006 to around 2012, when engineers and metallurgists finally came up with alloys that have better ductile properties. The better lead-free solders today are still nowhere near as good as lead-based solders. Lead is a soft and ductile material which maintains its ductility for the lifetime of the product, unlike more modern solders, which tend to get work-hardened.
 
Except the RROD was not caused by heat; it was a byproduct of the rapid industry switch to lead-free solders that had no ductility and mechanically failed. Heat definitely exacerbated the issue, but it was not the primary cause of the failure.
I don't disagree with you on this but heat was a factor. Better cooling would have helped. What would have also helped would have been extensive testing, which is something nobody does anymore it seems.
Incompetent and uneducated EU lawmakers are to blame: they wrote the law based on "heavy metals are bad, we should ban them" rather than "why is lead used over other metals, and can it be effectively replaced with a safe alternative?" The answer to the latter question is "no," as we found out from the extremely high failure rates across all ranges of electronic products from 2006 to around 2012, when engineers and metallurgists finally came up with alloys that have better ductile properties. The better lead-free solders today are still nowhere near as good as lead-based solders. Lead is a soft and ductile material which maintains its ductility for the lifetime of the product, unlike more modern solders, which tend to get work-hardened.
There's a lot of dead HP and Apple laptops as a result of the lead-free solder. Personally, I'd rather not have lead in my electronics for health reasons. They seem to have found a fix, considering how hot modern electronics get. While the solder may not fail, the silicon in the chips may.
 
I don't disagree with you on this but heat was a factor. Better cooling would have helped. What would have also helped would have been extensive testing, which is something nobody does anymore it seems.

Heat is not so much an issue as thermal cycling. The lead free solder lacked ductility so temperature swings would cause the BGA balls to shear from metal fatigue. BGA packages have always been problematic because uneven heating of the IC will lead to extreme and uneven expansion/contraction, which can quite literally be tons of force that causes the joint to shear somewhere between the PCB and IC. The narrower the window of thermals the BGA package experiences, the longer it is going to last. So if you have a PC which is never turned off vs a PC which is power cycled hundreds of times, the former machine will likely not experience as many failures.
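
A rough way to put numbers on this is the Coffin-Manson low-cycle fatigue relation, which is the usual textbook model for solder joint life under thermal cycling (treat this as a sketch, not the datasheet formula for any specific alloy):

```latex
N_f \approx C \, (\Delta\varepsilon_p)^{-n}
```

Here N_f is cycles to failure, Δε_p is the plastic strain the joint sees each thermal cycle (set by the size of the temperature swing and the CTE mismatch between the IC and the PCB), and C and n are empirical material constants, with n usually quoted above 1 for solders. That exponent is why shrinking the temperature swing buys disproportionately more joint life than shaving a few degrees off the steady-state temperature.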

It'd be nice if GPUs went back to QFP, but it's not really feasible with the massive number of connections required on modern chips. QFP is the most forgiving of the large-chip packaging options, since the whole IC package floats on hundreds of compliant leads that can flex freely.

There's a lot of dead HP and Apple laptops as a result of the lead free solder. Personally, I'd rather not have lead in my electronics for health reasons. They seemed to have found a fix considering how hot modern electronics get. While the solder may not fail, the silicon in the chips may.

If you're referring to the laptops with the Nvidia 8x00/9x00M GPUs, that was yet another issue. They still had the BGA cracking issue, but there was a further issue where the bond wires from the ASIC die would detach from the solder pads on the GPU package. No amount of reflowing will fix these, you can even remove, reball and reinstall the GPU package and it will still be dead.
 
This seems like an oversight or something to me. Why would you actively cool some but not all? This is another reason why I love the Xbox Series X design.

I’m happy Sony used liquid metal and took a better approach, but it’s still not good enough. They could have had cutouts in the shroud and used 1-cent aluminum memory heatsinks with 0.5 mm thermal pads, or taken a more expensive route and run that heat pipe around the other way to hit those chips.

It’s still within spec, but I don’t know why you cool some but not all. Seems short-sighted to me.
 
This kind of thing is why you never buy the first version of any new console generation. Let the labrats test them out first; I'll buy the improved revision for cheaper when there are more than 2 games available for it.
 
Nice, a dedicated FUD console thread for DukenukemX to shit on consoles some more. Thanks, Erek. 😐

Maybe start a thread, or come back to this one, when and IF this poses any real, widespread issues with the console. Otherwise, I think it's a good assumption that Sony's engineers have accounted for this and tested the console in reasonable ambient temperature ranges. At least it's certainly more believable that they did than that they didn't.
Like they tested the Xbox 360 (RROD) and PS3 (Yellow Light of Death)?
 
This kind of thing is why you never buy the first version of any new console generation. Let the labrats test them out first; I'll buy the improved revision for cheaper when there are more than 2 games available for it.
Eh, I'd say people are overstating the concerns here. I know anecdotes aren't everything, but I haven't had any overheating issues with my PS5, despite it sitting in a mostly enclosed cabinet (only the back is open).

If there have been any teething hardware issues, it's that Sony briefly introduced a weird glitch where the front USB ports wouldn't charge the controller. That was fixed in an update, and it's been smooth sailing ever since.
 
Eh, I'd say people are overstating the concerns here. I know anecdotes aren't everything, but I haven't had any overheating issues with my PS5, despite it sitting in a mostly enclosed cabinet (only the back is open).

If there have been any teething hardware issues, it's that Sony briefly introduced a weird glitch where the front USB ports wouldn't charge the controller. That was fixed in an update, and it's been smooth sailing ever since.
Heat that's excessive but within spec usually doesn't cause problems immediately. I'd wait and see whether they start to die in a few months. There are a lot of parts that don't like 90+ °C.
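
For a ballpark of what "excessive but within spec" costs you over time, the usual rule of thumb is an Arrhenius acceleration factor. A minimal sketch in Python, assuming a generic 0.7 eV activation energy (an illustrative textbook value, not a GDDR6 spec) and comparing the reported ~95 °C against a hypothetical 75 °C with better pad contact:

```python
import math

# Rough Arrhenius acceleration-factor sketch. All numbers below are
# assumptions for illustration, not Sony or memory-vendor specs.
BOLTZMANN_EV = 8.617e-5   # Boltzmann constant in eV/K
EA = 0.7                  # assumed activation energy in eV (illustrative)

def acceleration_factor(t_cool_c: float, t_hot_c: float) -> float:
    """Relative speed-up of temperature-driven wear going from t_cool_c to t_hot_c (degC)."""
    t_cool = t_cool_c + 273.15
    t_hot = t_hot_c + 273.15
    return math.exp((EA / BOLTZMANN_EV) * (1.0 / t_cool - 1.0 / t_hot))

# e.g. memory at 95 C vs a hypothetical 75 C with proper heatspreader contact:
print(f"{acceleration_factor(75, 95):.1f}x faster wear-out (rough estimate)")
```

With those assumptions you land around a 3-4x speed-up in temperature-driven wear-out; the real number depends entirely on which failure mechanism dominates and on the actual junction temperatures, which nobody outside Sony knows.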
 
Considering that opening up a PS5 doesn't seem that terrible, is there a way to just stick some really thick thermal pads over those GDDR6 chips so they make contact with the giant heatspreader? Sadly, due to the way that heatsink is made, I don't think just slapping thermal adhesive and RAM heatsinks on will work... =(
 
Considering that opening up a PS5 doesn't seem that terrible, is there a way to just stick some really thick thermal pads over those GDDR6 chips so they make contact with the giant heatspreader? Sadly, due to the way that heatsink is made, I don't think just slapping thermal adhesive and RAM heatsinks on will work... =(
You probably could, and that would reduce the temperature. There are lots of high-quality thermal pads out there that would easily outperform what Sony used. If I were a PS5 owner, I would probably do that ASAP.
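
For what it's worth, a quick back-of-envelope suggests a thick pad to the heatspreader wouldn't cost much thermally either. A minimal sketch with every value assumed for illustration (pad thickness, pad conductivity, chip footprint, and per-chip power are guesses, not measurements of the PS5):

```python
# Conduction through a thermal pad between a GDDR6 package and the heatspreader.
# All values are assumptions for illustration only.
PAD_THICKNESS_M = 1.5e-3       # 1.5 mm pad
PAD_CONDUCTIVITY = 6.0         # W/(m*K), a decent aftermarket pad
CHIP_AREA_M2 = 0.014 * 0.012   # ~14 mm x 12 mm package footprint
CHIP_POWER_W = 2.0             # assumed per-chip dissipation

r_pad = PAD_THICKNESS_M / (PAD_CONDUCTIVITY * CHIP_AREA_M2)  # thermal resistance, K/W
delta_t = CHIP_POWER_W * r_pad                               # temperature drop across the pad
print(f"~{r_pad:.1f} K/W through the pad, ~{delta_t:.1f} C drop at {CHIP_POWER_W} W per chip")
```

A couple of degrees of drop across the pad is a small price if it actually couples those chips to the big heatspreader instead of letting them bake under the shroud.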
 
Considering that opening up a PS5 doesn't seem that terrible, is there a way to just stick some really thick thermal pads over those GDDR6 chips so they make contact with the giant heatspreader? Sadly, due to the way that heatsink is made, I don't think just slapping thermal adhesive and RAM heatsinks on will work... =(
Thought they already have thermal pads for those chips?
 
All these YouTube videos. Have any of these streamers gotten a comment from Sony on it or is everyone just speculating and not trying to do actual reporting?
 
All these YouTube videos. Have any of these streamers gotten a comment from Sony on it or is everyone just speculating and not trying to do actual reporting?
Just speculation. With how seriously Sony and MS are taking cooling on the new consoles, I am sure they feel everything they did is more than fine. Neither Sony nor MS wants another RRoD fiasco.
 
Basically, after all that talking:
"Probably doesn't matter."
That's been my view. Not to completely rule out possibilities, but there are a lot of armchair engineers who like to say they've discovered a flaw but are really speculating or operating on incomplete info. And let's face it, it's easy to give in to the sensationalism, especially if you're anti-console or anti-Sony.
 
Fine is subjective. It'll work fine for now, but for how long? Components running that hot will inevitably fail, just like with the Xbox 360 RROD.
Negative. The 360 RROD was a chip design failure on the part of MS. This is beyond well documented; they did not add the needed metal layer, and people tried to blame ATi (AMD).
 
I'm aware of those. Now, what do you think the percentage of affected units was out of the total consoles sold? It's probably not a citable figure, but I'd wager less than 5%, especially in the PS2's case, given it's the best-selling console ever.
You pulled 5% out of your ass. Can you find actual statistics on the failure rates, from which a sound conclusion can be drawn rather than speculation?
I don't know who else watched the whole GN video, but they basically used a cheap steel plate (not even aluminum) with minimal contact to the RAM chips, some of which are nowhere near the cooling plate. They should have added another heatpipe for the RAM, or made the one cooling the VRMs larger, or something. Really just seems like a lazy and cheap design.
Watch out, Sony fans in the thread.

Time will tell.
 
Sounds like FUD to me. Anyone in the field (owners) experiencing issues?
 
So, FUD. No reasonable basis.

Not completely without basis. We do know that silicon chips do not like excessively high temps in the long run, and heat has a direct effect on their lifetime. 95 °C is already very bad, and it will only get worse after a year of dust build-up with a hot summer on top. Maybe they will last, maybe they won't, but this is simply a bad design, and people are right to be worried.
 