NVIDIA AIC Partners Asked to Collect RTX 4090 Cards Affected by Melted 16-Pin Power Connectors

This is what I ordered, and I'm looking forward to their 90-degree angled cable in the future.
Just did a post on that now.

I think I'm going to order something from them for my 3090 Ti Hybrid just to be on the safe side.
 
Being the smartA** that I am, I was already thinking that NV and partners, moving forward, ought to be providing fire extinguishers with these cards.

Edit: Just watched it and I wholeheartedly agree with everything Steve says during it. It was already on my radar to look for this, but I hadn't gotten to it yet. Thanks for posting.
 
I was discussing this with a guy on H who stated he believed that these "lower grade cables" probably shouldn't have made it out to the consumer, but somehow did. I agree with him that that's probably what happened, and why not everyone is having an issue with their cards and the NVIDIA adapter.
 
Nvidia needs to jump all over this. I've seen melting plastic on these connectors over and over. The guys at Corsair, was it, that were calling the YouTubers idiots over this have to be eating crow at this stage. I await an official fix before I order any card that comes with an obviously faulty adapter.
 
Yes, it means the card connector is not poorly designed. Pretty much everyone who has been investigating the issue blames the adapter, not the card connector.
Not to be pedantic, but "poorly designed" is not synonymous with "faulty" or "defective". You're correct that it's the adapter that has come under fire (🔥) as the point of failure, but the design of the new 12VHPWR connector has received its share of criticism as well, and rightly so, which is why many were relieved to discover that AMD is sticking with the 8-pin PEG power connectors for its upcoming 7000-series cards.
 
I find the most shocking thing, so far, to be the lack of communication from NV on this to its customers. Letting things come up across forums and YT is poor handling, to say the least. Even this request to its board partners is only between them and the partners and was not actually communicated to the public. Here's hoping an official statement comes out soon, because relying on the snippets of quotes each tech site manages to get isn't helping much of anyone.
 
As of today, there is ZERO evidence that the connector itself is badly designed; again, the blame universally falls on the adapter.

BTW, it's evident that AMD didn't have the new connector anyway. It's not like they would switch only a few weeks from release.
They are just happy to jump on the squid-adapter fiasco bandwagon.
 
The “connector” is only half of a whole. The other half is the “adapter”.

While I will admit that you can have bad adapters without having bad connectors — a better design or tighter specifications for the whole could have sidestepped the issue entirely (or caused an entirely different set of issues).

My take is that the entire design is poor. It didn’t exactly cause some manufacturers to create bad adapters, but it hasn’t exactly prevented issues or improved the product, installation, or operation either. It’s been nothing but a net negative to date.
 
Tnx for your opinion, but that's not fact. The fact (at the very least for now, unless there's new evidence) is that the problem is the adapter, not the connector on the video card. No matter what you think, it doesn't change the fact.

When, or if, the problem surfaces with ATX 3.0 PSUs, then you could argue the connector is bad. To date, there's no evidence to support this.
 
Why does there have to be a problem with the Power Supply for it to be a crappy connector?

I also think, in my opinion, that plenty of other connectors are bad. Micro-USB drives me nuts, just for instance, and I've damaged at least a dozen devices by managing to cram it in there backwards when not paying enough attention. While I would say those devices were, by and large, crap, if they had used a different connector it wouldn't have been an issue in the first place.

Same thing here.

Other bad connectors:
Screwless Terminal blocks - these things make me want to pull out my hair
9V battery terminals that aren't on a rigid PCB - those things are bastards to get apart
Any round cable with an internal key (such as cheap S-Video cables and some PS/2 connectors) -- good cables usually have some external indicator, but not all did
Berg Connector -- it had an external key (2 actually), but many devices implemented it so poorly it was easily inserted backwards.
DisplayPort - the locking mechanism is nice, but holy cow it can be difficult to get off when the cable is laying against a monitor panel

RJ45 gets an honorable mention - too often the locking clasp gets tweaked and won't actually lock any more - can't count how many networking issues have boiled down to just that right there. And when they have the boot on them, they can be bastards to get out when they're all crammed in a row in a switch or something.
 
Just putting this here for those interested and keeping up on it. Not enough for a whole story on it and not enough info to clarify how it happened but still, probably not a good thing either. This could be the 1st ATX 3.0 16-pin cable (not adapter) I've heard of this happening to. I just checked NVIDIA's newsroom, blog, and Twitter, and still nada on any of it.

 

I wouldn't be surprised if this is true and more showed up. The 9/9.5 A per-pin rating seems to be being met during normal use. That 9/9.5 A rating should never be that high if merely meeting it would have the connector burn down because of a little bend, a plug that isn't seated absolutely perfectly, a little construction variation, or some such. That's absurd. Those pin ratings, however they were arrived at, have never been pushed to their limit before, hence the 'rating' was never tested in a meaningful way. Honestly, ~100 W is a ton for such seemingly small pins. These connections, in whatever form, adapter or not, are not over-provisioned enough. The rating needs to be 60% of what they think it is.
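For anyone who wants to sanity-check that margin argument, here's a rough back-of-the-envelope sketch (the per-terminal ampacities are the approximate figures commonly cited for these connector families, not official spec numbers):

```python
# Rough safety-margin comparison: 12VHPWR vs. the classic 8-pin PCIe plug.
# Per-terminal current ratings are approximate, commonly cited figures
# (~9.5 A for the new Micro-Fit-style pins, ~8 A for Mini-Fit Jr style),
# not official spec values.

VOLTAGE = 12.0  # volts on the +12 V rail

def connector_margin(name, power_pins, amps_per_pin, rated_watts):
    # Worst-case wattage if every +12 V pin carries its full rated current
    max_watts = power_pins * amps_per_pin * VOLTAGE
    headroom = max_watts / rated_watts
    print(f"{name}: {max_watts:.0f} W max vs {rated_watts} W rating "
          f"-> {headroom:.2f}x headroom")

# 12VHPWR: six +12 V pins, 600 W total rating
connector_margin("12VHPWR", power_pins=6, amps_per_pin=9.5, rated_watts=600)
# 8-pin PCIe (PEG): three +12 V pins, 150 W total rating
connector_margin("8-pin PEG", power_pins=3, amps_per_pin=8.0, rated_watts=150)
```

By that math the new connector has maybe 14% headroom over its 600 W rating, where the old 8-pin had nearly 2x over its 150 W, which is the "not over-provisioned enough" point in a nutshell.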
 

That changes everything for me, if more reports like this come out.

Gotta say I've seen my fair share of melted connectors, be it Molex, SATA power, ATX, and PCIe, so I was willing to give Nvidia the benefit of the doubt.

At the very least this might mean the connectors need to meet tighter specs.
 
At the very least this might mean the connectors need to meet tighter specs.
I've said in other replies that I'm not remotely an expert on it, but my gut feeling is that this is pushing that type of connector and cabling to a limit they shouldn't have to go to. It's just too small, in my opinion. I have it on my 3090 Ti's, but thankfully they use quite a bit less power, so I'm not overly concerned. I was hoping to upgrade to whatever succeeds the 4090 later on, but that's not going to happen if this doesn't get straightened out.
 
I have it on my 3090 Ti's, but thankfully they use quite a bit less power
Hmm do they?

TDP is listed as 450 W for the 3090 Ti; the 4090 is listed as the same.

Now, there is a difference between published TDP and what you see real world, and of course overclocks throw all that out the window. But at least on paper it looks like they are about the same, without having one of each to look at power draw side by side.

Here's as close as I can come, from your reviews - looks like the 4090 is higher, but only by a small margin - 35 W or so - on just the chip, but total board power is almost identical, and that's what is going to matter for power supply draw. Straw met camel, or something else?

Also, shouldn't these cards be able to pull up to 75 W through the PCIe slot itself? CPU-Z probably can't tell where the power is coming from specifically, just total power used.
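To put rough numbers on that, here's a quick sketch (it assumes the published 450 W board power and that the card pulls the full 75 W through the slot; both are assumptions, since real slot draw varies by board):

```python
# Hypothetical split of a 450 W board power between the PCIe slot and
# the 16-pin connector; the 450 W figure and the full 75 W slot draw
# are illustrative assumptions, not measurements.
BOARD_POWER = 450.0  # watts, published TDP for the 3090 Ti / 4090
SLOT_POWER = 75.0    # watts, PCIe slot power limit
VOLTAGE = 12.0       # volts on the +12 V rail
POWER_PINS = 6       # +12 V current-carrying pins in the 16-pin connector

connector_watts = BOARD_POWER - SLOT_POWER             # 375 W over the cable
amps_per_pin = connector_watts / VOLTAGE / POWER_PINS  # ~5.2 A per pin
print(f"~{connector_watts:.0f} W over the connector, "
      f"~{amps_per_pin:.1f} A per pin if current is shared evenly")
```

Roughly 5 A per pin is well under the ~9.5 A terminal figure; the trouble starts when a bad crimp or a half-seated plug dumps that load onto fewer pins.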



 

It's also why I'm keeping an eye on this. I know it's not hugely lower, and there's been some debate on other sites about how much as well. I'm not sure if the adapter that EVGA provided with its 3090 Ti's is the exact same as those given out for the 4090s either. What I can tell you about the ones I got is that they're 12-pin to 3x 8-pin (they don't have the 'sense' pins), as opposed to 16-pin to 3x or 4x 8-pin.

Yep, 75 W at the slot is still the standard for PCIe 4.0. At one time it was erroneously reported that it would go to 300 W, but PCI-SIG said no to that. I don't really know about 5.0, but that's not really relevant for the vast majority of us right now.

https://www.techspot.com/news/66108-pcie-4-wont-support-300-watts-slot-power.html
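On those 'sense' pins: as I understand it, they're how a 16-pin cable advertises its power capability to the card, which a plain 12-pin adapter can't do. A minimal sketch of the idea, assuming the commonly published ATX 3.0 / PCIe 5.0 sideband mapping (treat the exact values as an assumption and check the spec before relying on them):

```python
# Sketch of how the 16-pin connector's SENSE0/SENSE1 sideband pins
# advertise a sustained power limit. The mapping below is the commonly
# published one; treat it as an assumption, not an authoritative table.
# Key: (SENSE0 grounded?, SENSE1 grounded?) -> sustained watts.
SENSE_TO_WATTS = {
    (True, True): 600,
    (True, False): 450,
    (False, True): 300,
    (False, False): 150,
}

def rated_power(sense0_grounded: bool, sense1_grounded: bool) -> int:
    """Return the sustained power limit the cable is advertising."""
    return SENSE_TO_WATTS[(sense0_grounded, sense1_grounded)]

print(rated_power(True, True))    # 600 W cable
print(rated_power(False, False))  # 150 W fallback, e.g. no sense pins tied
```

That's also why a half-seated plug is nasty: the sense pins can still make contact and advertise full power while the power pins aren't fully mated.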
 

I still believe that some of this may be due to user error, and I'm not referring to just the bending-of-the-cable theory. When I got my card I thought the plug was in all the way on the card when it wasn't. The system powered on and the card worked fine. After I decided to look things over again after a short run, I noticed the connector was not in all the way, and when I pushed it in I heard a very faint click. The tab is on the bottom of the connector, unlike the normal 8-pin PCIe cables, so it could easily be missed. Not sticking up for Nvidia either, just pointing out what may be user error.
 
I agree. The click on mine was a bit too soft for my tastes as well and I remember trying it multiple times until I felt confident with it.
 
The PCIe spec is backwards compatible, so PCIe 5.0 would have to allow at least 75 W at the slot - it could be more (I kinda doubt it) but it couldn't be any less.
 