- cross-posted to:
- technology@lemmy.world
Sitting on my 7900xtx eating popcorn.
Copying over my comment from elsewhere:
The person on Reddit used a third-party cable instead of the one supplied with the device.
https://www.reddit.com/r/nvidia/comments/1ilhfk0/rtx_5090fe_molten_12vhpwr/
It melted on both sides (PSU and GPU), which suggests the cable itself was probably the issue.
12VHPWR is a fucking mess, so please don’t tempt fate with your expensive purchase.
Interesting that it lasted two years on their 4090 with no ill effects. Even though the 5090 has a higher power draw, I suspect the contact resistance of the cable was causing heating even before, just not enough to smell or deform the connectors.
This is why I don’t feel too strongly about people who buy first-batch hardware. They’re the QC for the rest of us!
Not even first batch as far as this connector goes. This has been an issue ever since this connector was released on the 40 series cards in 2022.
The sensible thing would’ve been to just roll back to the standard 8-pin PCIe power connector, which has been reliable for many years. I guess requiring four of these for 600 watts would highlight how ridiculous the power draw of the 5090 is.
Instead they made small iterations to this 12VHPWR connector (changing sense pin lengths and other small adjustments) and they’re letting their paying customers test the new iteration with the 50 series.
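The connector count above checks out against the standard ratings (150 W per 8-pin PCIe auxiliary connector; I'm ignoring the extra 75 W the slot itself can supply):

```python
# Quick check of the "four 8-pin connectors for 600 W" math.
EIGHT_PIN_RATING_W = 150  # standard rating for one 8-pin PCIe connector
GPU_DRAW_W = 600          # claimed 5090 draw over 12VHPWR

connectors_needed = -(-GPU_DRAW_W // EIGHT_PIN_RATING_W)  # ceiling division
print(f"{connectors_needed} x 8-pin connectors for {GPU_DRAW_W} W")
# prints "4 x 8-pin connectors for 600 W"
```

Which is exactly the optics problem: a wall of four bulky connectors makes the power budget impossible to ignore, while one small 12VHPWR plug hides it.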
Admit that 12VHPWR is bullshit and revert to 8-pin: either come up with a connector that actually works, or just stick with 8-pin long-term.
Ngl I kinda think anyone who buys a current gen, top of the line GPU is a bit of a chump and has been for over a decade.
They just don’t make that big of a difference.
I mean… I jumped on a 3080FE several years ago near release time because
- Rona time
- it was an absurd amount of generational uplift - as in, I’m not sure we’ll ever see that sort of gen-over-gen performance gain again
> They just don’t make that big of a difference.
The 40 series was basically the exact opposite of this. The lower down the stack you went, the worse the per-generation gains got, with the lower-end cards sometimes seeing regressions because of a lack of memory bandwidth/capacity.
You’d think they’d learn after like the 4th or maybe 5th time. I’ve honestly lost count.
Why not just use a connector that is designed for this much power and is already cheap and easily available?
One plug for the GPU, one for everything else. I give it 5 years until this is reality lmao
It’s fine. Nvidia is introducing these new quick disconnects: