Nvidia RTX 4090 power connectors melting?

The first time I saw that garbage connector, I knew it was going to cause problems. Of course pushing more power through fewer, MUCH smaller pins is going to be an unmitigated disaster. They should have stuck with three or four 8-pin PCIe power connectors.
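Quick back-of-the-envelope to put numbers on that (pin counts are from the PCIe connector definitions; the even-split-per-pin assumption is mine):

```python
# Per-pin current if the load splits evenly across the 12V pins.
# An 8-pin PCIe connector has 3x 12V pins (150W rated);
# 12VHPWR has 6x 12V pins (600W rated).

def amps_per_pin(watts, volts, power_pins):
    """Current each 12V pin carries, assuming an even split."""
    return watts / volts / power_pins

print(f"4x 8-pin @ 600W: {amps_per_pin(600, 12, 12):.2f} A/pin")  # ~4.17 A
print(f"12VHPWR  @ 600W: {amps_per_pin(600, 12, 6):.2f} A/pin")   # ~8.33 A
```

Roughly double the current per pin, on physically smaller terminals.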

The industry's obsession with making ever-smaller connectors and cranking vastly more power through them has been causing problems since USB-C and Lightning were introduced. Whichever engineer thought it was a great idea to have multiple voltages and up to 100W of power on a pin almost as thin as a hair doesn't need to have a job engineering.
 
I can only imagine how many problems this is going to cause, even at the system builder level. This really isn't safe by any measure.
 
The first time I saw that garbage connector, I knew it was going to cause problems. Of course pushing more power through fewer, MUCH smaller pins is going to be an unmitigated disaster. They should have stuck with three or four 8-pin PCIe power connectors.

The industry's obsession with making ever-smaller connectors and cranking vastly more power through them has been causing problems since USB-C and Lightning were introduced. Whichever engineer thought it was a great idea to have multiple voltages and up to 100W of power on a pin almost as thin as a hair doesn't need to have a job engineering.
I don't know, I think they can make something better. Having to plug four 8-pin connectors into the GPU sucks.
 
This style of connector was used with the 3090 FE, 3090 Ti FE, and AIB 3090 Ti models with no issues. I own a 3090 Ti and a 4090, and they both pull about the same wattage while gaming. I don't think we have enough information to clearly say that this new 16-pin 12VHPWR power connector is at fault. The one thing that is clear is that the physical size of these 4090 cards is causing end users to bend the power cable at 90-degree angles so it fits in their computer cases.
 
This style of connector was used with the 3090 FE, 3090 Ti FE, and AIB 3090 Ti models with no issues. I own a 3090 Ti and a 4090, and they both pull about the same wattage while gaming. I don't think we have enough information to clearly say that this new 16-pin 12VHPWR power connector is at fault. The one thing that is clear is that the physical size of these 4090 cards is causing end users to bend the power cable at 90-degree angles so it fits in their computer cases.
I agree. Installed properly, they should be fine. It's when you try to cram too much shit into a confined space that problems begin to happen. Buying a bigger case, buying an ATX 3.0 PSU, or just leaving the side panel off appears to be the answer for those who want to bend the crap out of the cables. It would also have been nice if NV hadn't designed these cards to have big honking 600W HSFs.

I've pulled the side off my case and plan on buying a wider case so I don't have to bend the cables too much. Will be purchasing a new PSU when decent ones become available.
 
I don't know, I think they can make something better. Having to plug four 8-pin connectors into the GPU sucks.
They could of course do something different, but they can't ignore physics.

If they want to crank more power through smaller pins, they need to bump the voltage up. If they had introduced the newer connector while also introducing a new power rail to the ATX 3.0 standard, like 24V or 48V, they could have been just fine with the smaller connector. PoE+ uses 48V on twisted pair, and it can crank a ton of power with the only real enemy being liquid ingress.
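Rough numbers on why the voltage bump matters (the 5 mΩ contact resistance below is just an illustrative guess, not a measured figure):

```python
# For a fixed power draw, current scales as 1/V, and resistive heating
# in wires and contacts scales as I^2 * R.

POWER_W = 600        # target GPU power draw
CONTACT_R = 0.005    # assumed 5 milliohm contact resistance (illustrative)

for volts in (12, 24, 48):
    amps = POWER_W / volts
    heat_w = amps ** 2 * CONTACT_R
    print(f"{volts:>2}V rail: {amps:5.1f} A total, ~{heat_w:.2f} W lost in a 5 mOhm contact")

# 12V -> 50.0 A / 12.50 W, 24V -> 25.0 A / 3.13 W, 48V -> 12.5 A / 0.78 W:
# doubling the voltage quarters the heat; 48V cuts it to a sixteenth.
```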
 
They could of course do something different, but they can't ignore physics.

If they want to crank more power through smaller pins, they need to bump the voltage up. If they had introduced the newer connector while also introducing a new power rail to the ATX 3.0 standard, like 24V or 48V, they could have been just fine with the smaller connector. PoE+ uses 48V on twisted pair, and it can crank a ton of power with the only real enemy being liquid ingress.
Wasn't saying they can ignore physics, just that it's sad this is the best we got when we know they could've made something better.
 
I don't know, I think they can make something better. Having to plug four 8-pin connectors into the GPU sucks.
I agree they can make something better, although plugging in four connectors is hardly an inconvenience compared to the alternative...

Take for instance the RC hobby world. The XT60 connectors are VERY robust, can handle much more than 30 connection cycles, are pretty small relative to their current-carrying capacity, and can very much handle the wires bending since the connection is very secure. They are also not even that hard to plug and unplug. They can handle 60 amps continuous (180 amps peak): 12 x 60 = 720 watts on one connector. Add another one for safety margin and bam, you've got yourself a bulletproof connection for a 4090. I would make the XT60 connectors black, otherwise it would look funny having a yellow connector. Putting XT60s on PC parts can't be that expensive for manufacturers, can it?
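Sanity check on that XT60 math (the 60 A continuous figure is the commonly quoted rating; the 80% derate is my own assumption):

```python
# Derated capacity of XT60 connectors at 12V.
RATED_A = 60      # commonly quoted continuous rating
VOLTS = 12
DERATE = 0.8      # assume running at 80% of rating for margin

one = VOLTS * RATED_A * DERATE
print(f"one XT60, derated:  {one:.0f} W")      # 576 W
print(f"two XT60s, derated: {2 * one:.0f} W")  # 1152 W -> ~2x headroom over 600W
```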
 
This style of connector was used with the 3090 FE, 3090 Ti FE, and AIB 3090 Ti models with no issues. I own a 3090 Ti and a 4090, and they both pull about the same wattage while gaming. I don't think we have enough information to clearly say that this new 16-pin 12VHPWR power connector is at fault. The one thing that is clear is that the physical size of these 4090 cards is causing end users to bend the power cable at 90-degree angles so it fits in their computer cases.
The 12-pin on Ampere was designed by NVIDIA. The adapter that NVIDIA is including with Lovelace was made to PCI-SIG specifications. Seems like PCI-SIG should have been consulting NVIDIA on the connector design for 12VHPWR.
I agree they can make something better, although plugging in four connectors is hardly an inconvenience compared to the alternative...

Take for instance the RC hobby world. The XT60 connectors are VERY robust, can handle much more than 30 connection cycles, are pretty small relative to their current-carrying capacity, and can very much handle the wires bending since the connection is very secure. They are also not even that hard to plug and unplug. They can handle 60 amps continuous (180 amps peak): 12 x 60 = 720 watts on one connector. Add another one for safety margin and bam, you've got yourself a bulletproof connection for a 4090. I would make the XT60 connectors black, otherwise it would look funny having a yellow connector. Putting XT60s on PC parts can't be that expensive for manufacturers, can it?
XT60 also uses 12AWG wire since it carries all 60A over one wire. 12AWG isn't very flexible and wouldn't be practical in current ATX PC case design.
 
I can only imagine how many problems this is going to cause, even at the system builder level. This really isn't safe by any measure.
Oh man, can you imagine some of the prebuilt F-ups? This is going to end up shutting down some smaller crap builders... which I guess isn't a bad thing. I just hope they don't kill anyone in the resulting house fires.
 
The 12-pin on Ampere was designed by NVIDIA. The adapter that NVIDIA is including with Lovelace was made to PCI-SIG specifications. Seems like PCI-SIG should have been consulting NVIDIA on the connector design for 12VHPWR.

XT60 also uses 12AWG wire since it carries all 60A over one wire. 12AWG isn't very flexible and wouldn't be practical in current ATX PC case design.
It is Nvidia's design. PCI-SIG didn't throw NV's design out and make their own.
The difference is simple... Ampere didn't push over 450 watts. The new cards are being pushed by people overclocking up to 600 watts. At the max spec of the cable, even a minor tilt or bend adds a ton of resistance. At 450 watts, probably no problem... at 500-600 watts with a little bit of overclocking, even a minor bit of resistance turns out a ton of heat.

This is PCI-SIG's fault, sure... they should have probably laughed at Nvidia's push to make this connector a standard. Money talks.
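To put rough numbers on the bend-equals-heat argument (both resistance values are illustrative guesses, not measurements):

```python
# Heat dissipated in a single pin's contact: P = I^2 * R.
# At 600W over 12VHPWR's six 12V pins, each pin carries ~8.33 A
# if the load splits evenly.

amps_per_pin = 600 / 12 / 6   # ~8.33 A

for label, r_ohms in [("healthy contact, ~5 mOhm", 0.005),
                      ("tilted/worn contact, ~50 mOhm", 0.050)]:
    watts = amps_per_pin ** 2 * r_ohms
    print(f"{label}: ~{watts:.2f} W in one tiny terminal")

# ~0.35 W vs ~3.47 W -- and if one pin loses contact entirely, the rest
# pick up its share, and their heat rises with the square of the current.
```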
 
It is Nvidia's design. PCI-SIG didn't throw NV's design out and make their own.
The difference is simple... Ampere didn't push over 450 watts. The new cards are being pushed by people overclocking up to 600 watts. At the max spec of the cable, even a minor tilt or bend adds a ton of resistance. At 450 watts, probably no problem... at 500-600 watts with a little bit of overclocking, even a minor bit of resistance turns out a ton of heat.

This is PCI-SIG's fault, sure... they should have probably laughed at Nvidia's push to make this connector a standard. Money talks.
Source on NVIDIA pushing the connector as a standard?
 
Source on NVIDIA pushing the connector as a standard?
https://www.tomshardware.com/news/n...cards-may-use-new-12-pin-pcie-power-connector
"Our own insider source has confirmed that the connector is indeed real and has been submitted to the PCI-SIG standards body. However, it remains to be seen whether it will pass approval."

This isn't some secret. It's an Nvidia cable... they submitted it to PCI-SIG for approval a couple of years back already. It's a 100% Nvidia part adopted by PCI-SIG.
 
The 12-pin on Ampere was designed by NVIDIA. The adapter that NVIDIA is including with Lovelace was made to PCI-SIG specifications. Seems like PCI-SIG should have been consulting NVIDIA on the connector design for 12VHPWR.

XT60 also uses 12AWG wire since it carries all 60A over one wire. 12AWG isn't very flexible and wouldn't be practical in current ATX PC case design.
Oh, they are plenty flexible if the individual strands in the wires are thinner, but that would require a few changes from PSU manufacturers.
Or keep the same 18awg wires but combine them all at the connector lol. (Jk)

Seriously though, other people have already done the hard work of designing better connectors; would it hurt so much to adopt those same connectors (like the XT60)?
 
It's not the connector's fault, it's the spikes! They forgot to spec it under full transient load! Right?
 
It's not the connector's fault, it's the spikes! They forgot to spec it under full transient load! Right?
No, it's the connector, sort of... they designed a connector that has a usage requirement beyond "plug it in." For a consumer device with a strict bend-radius requirement, they should have designed it in a way that you cannot violate that requirement.

They should have designed it as a right angle and encased the bend in hard plastic. Then no amount of reasonable force would break the tolerances.
 
https://www.tomshardware.com/news/n...cards-may-use-new-12-pin-pcie-power-connector
"Our own insider source has confirmed that the connector is indeed real and has been submitted to the PCI-SIG standards body. However, it remains to be seen whether it will pass approval."

This isn't some secret. It's an Nvidia cable... they submitted it to PCI-SIG for approval a couple of years back already. It's a 100% Nvidia part adopted by PCI-SIG.
"Our own insider source."

So it's just a rumor.
 
No, it's the connector, sort of... they designed a connector that has a usage requirement beyond "plug it in." For a consumer device with a strict bend-radius requirement, they should have designed it in a way that you cannot violate that requirement.

They should have designed it as a right angle and encased the bend in hard plastic. Then no amount of reasonable force would break the tolerances.
Yet again the average user's ability to screw things up is being underestimated...
 
No, it's the connector, sort of... they designed a connector that has a usage requirement beyond "plug it in." For a consumer device with a strict bend-radius requirement, they should have designed it in a way that you cannot violate that requirement.

They should have designed it as a right angle and encased the bend in hard plastic. Then no amount of reasonable force would break the tolerances.
I couldn't agree more.
Cablemod has already made it, but theirs is on pre-order only after the 31st.
https://store.cablemod.com/cablemod-12vhpwr-right-angle-adapter/

Nvidia's connector is just ignorant, knowing that the vast majority of people are going to ham-fist the installation/cabling and aren't going to RTFM. They have created a problem that didn't need to exist.

Edit: apparently there's nothing in the manual that tells users how to properly use the cable/connector.
 
No, it's the connector, sort of... they designed a connector that has a usage requirement beyond "plug it in." For a consumer device with a strict bend-radius requirement, they should have designed it in a way that you cannot violate that requirement.

They should have designed it as a right angle and encased the bend in hard plastic. Then no amount of reasonable force would break the tolerances.
One of the first things I thought when this broke was "right-angle connector would solve the bend issue" and sure enough, Tom's Hardware has a link already:

https://www.tomshardware.com/news/right-angle-16-pin-connector-may-save-a-lot-of-rtx-4090-gpus
 
No, it's the connector, sort of... they designed a connector that has a usage requirement beyond "plug it in." For a consumer device with a strict bend-radius requirement, they should have designed it in a way that you cannot violate that requirement.

They should have designed it as a right angle and encased the bend in hard plastic. Then no amount of reasonable force would break the tolerances.
So it's the new Apple, except instead of "You aren't holding it right!", it is now "You aren't bending it right!"
 
One of the first things I thought when this broke was "right-angle connector would solve the bend issue" and sure enough, Tom's Hardware has a link already:

https://www.tomshardware.com/news/right-angle-16-pin-connector-may-save-a-lot-of-rtx-4090-gpus

That should solve this for the most part. It's still a lot of power for that single connector. It's not that the design is inadequate per se, but there's not a lot of fault tolerance when you push a connector right to its capacity. I don't think this one is Nvidia's fault, aside from the girth of the 4090 practically pushing the port up against the case wall, making the connector unable to fit in the gap. The port should have been at an angle or somewhere else on the card.

Never mind that there will be cheap cables around that are utterly incapable of delivering 600W under any circumstances, not that we haven't always seen that.

600W is a lot for such a petite set of contacts.
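Back-of-the-envelope on how thin the margin is (the per-terminal ratings below are the commonly cited figures, so treat them as approximate):

```python
# Headroom = rated per-pin current vs. what the load actually asks of a pin.
def load_and_headroom(rated_a, load_w, volts, pins):
    used_a = load_w / volts / pins
    return used_a, (rated_a - used_a) / used_a * 100

# 12VHPWR: ~9.5 A per terminal is the commonly cited rating; 600W over 6 pins.
a, pct = load_and_headroom(9.5, 600, 12, 6)
print(f"12VHPWR:    {a:.2f} A/pin, ~{pct:.0f}% headroom")    # ~14%

# 8-pin PCIe: HCS Mini-Fit Jr terminals are often cited around 8 A,
# but the connector is only specced for 150W over 3 power pins.
a, pct = load_and_headroom(8.0, 150, 12, 3)
print(f"8-pin PCIe: {a:.2f} A/pin, ~{pct:.0f}% headroom")    # ~90%
```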
 
https://www.tomshardware.com/news/n...cards-may-use-new-12-pin-pcie-power-connector
"Our own insider source has confirmed that the connector is indeed real and has been submitted to the PCI-SIG standards body. However, it remains to be seen whether it will pass approval."

This isn't some secret. It's an Nvidia cable... they submitted it to PCI-SIG for approval a couple of years back already. It's a 100% Nvidia part adopted by PCI-SIG.
That's a different connector. Ampere's isn't the same thing.
 
"Our own insider source."

So it's just a rumor.
Give me a break... ya, it's a rumor. The cable Nvidia uses has the SAME Molex plug on it... and is wired EXACTLY the same. So much so that 3090 owners can just take their "pre-SIG" Nvidia cards and plug in an ATX 3.0 SIG cable and be fine... but sure, it's totally NOT the same cable. lmao

Don't be silly. It's Nvidia's cable... they sent it to the SIG for certification almost three years ago.
 
The issue is with the 12VHPWR connector, not the adapter. What makes you think that the situation would be any better with a native ATX 3.0 PSU?
It won't have all those thick cables attached to the plug like you get with the adapter. From the pictures I've seen, the cable looks more flexible. It will hopefully not put the same strain on the plug.
 
The issue is with the 12VHPWR connector, not the adapter. What makes you think that the situation would be any better with a native ATX 3.0 PSU?
Hopefully, quality PSU makers will take the bend radius into consideration when manufacturing their native cables/connectors so that it is not left up to the user to be super mindful of their installation.
 
That's a different connector. Ampere's isn't the same thing.
Is it just this minus the sense pins, or actually different?

The other difference is my 3080 Ti FE only has two 8-pins adapting to the 12-pin. On top of that, it's also flipped vertically and angled, so there are literally no sharp bends needed in the cable. Odd to me that they didn't just keep that same orientation.
 
The issue is with the 12VHPWR connector, not the adapter. What makes you think that the situation would be any better with a native ATX 3.0 PSU?
From the pictures I've seen, it's always the build quality of the connector. It doesn't matter which is used. The connector separating or pins pulling out seems to be the biggest problem. The three bequiet! PSU connector failures were all due to the connector separating (2) or pins pulling out (1). They were all linked from a Reddit post. Those are the only PSU connector failures I've seen, but tbh I didn't know bequiet! had even started selling their ATX 3.0 units yet.
 
Kind of an odd issue. A decent 8-pin (6+2) can push 300W over all three 12V pins. Nvidia could have made a 12-pin of the same form factor that could do in excess of 600W. I'm also tired of seeing four 6+2s going into these harnesses. It's unnecessary; if you're willing to push a small custom connector close to its limit, there's no reason to be so ridiculously cautious on the other end of the connector, especially when all the 6+2s are normally tied to the same 12V rail.

If you are making a custom connector, use copper tabs like server nodes do. Those can push over 1kW reliably and could be fed with just two high-strand-count, heavy-gauge wires. Then sell adapters a foot or so long, letting you hide the 6+2 pin connector stupidness.
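For what it's worth, the 300W-per-8-pin arithmetic pencils out (assuming HCS-grade terminals on the 8-pin side):

```python
# 300W over an 8-pin's three 12V pins asks each (larger) pin for the same
# current that 600W over 12VHPWR's six (smaller) pins asks of each of those.
print(f"{300 / 12 / 3:.2f} A per (larger) Mini-Fit Jr pin")   # 8.33 A
print(f"{600 / 12 / 6:.2f} A per (smaller) 12VHPWR pin")      # 8.33 A
# Same amps per pin, but on much beefier terminals -- which is why a
# same-form-factor 12-pin doing 600W+ isn't a crazy idea.
```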
 