You are so wrong it's not even funny. 600W is the TOTAL power. The NVIDIA PATENTED 12VHPWR connector, which literally came out with the 3000 series, has SIX 12V power wires and six ground wires. That means 600/6 = 100W per wire, and 100W divided by 12 volts is about 8.33 amps per wire. The "spec" according to Nvidia is good up to about 9 amps or slightly higher. Now turn around and look at a single PCIe 8-pin connector. It has 3 power wires and 5 grounds. Guess what? Put TWO 8-pins together, which is exactly what current adapters do, and you now have 6 power wires and 10 grounds. That's MORE grounds than the 12VHPWR standard and the same number of power wires... and both specifications use 18 gauge wire. PSUs use 18 gauge and the Nvidia 12VHPWR standard uses 18 gauge. There is literally no DIFFERENCE.
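Don't take my word for it, run the numbers yourself. Here's a quick Python sketch using only the figures from this post (600W total, six 12V wires, 12V rail), nothing official:

# Per-wire math for the 12VHPWR connector, using the numbers above
total_watts = 600      # total connector rating claimed above
power_wires = 6        # six 12V conductors
volts = 12.0

watts_per_wire = total_watts / power_wires    # 100 W per wire
amps_per_wire = watts_per_wire / volts        # ~8.33 A per wire
print(f"{watts_per_wire:.0f} W per wire, {amps_per_wire:.2f} A per wire")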
Now go look at the power charts. THREE FEET of 18 gauge wire is good for up to 15 amps. 15 amps times 12 volts equals 180 watts, which means a SINGLE wire in the 8-pin PCIe PSU standard is capable of 180 watts on the high end. With THREE power wires, that comes to 540 watts for a single 8-pin cable... EVEN IF WE PLAY IT CONSERVATIVE to be safe, you could push 10 amps through that wire safely. 10 amps times 12 volts is 120 watts per wire, and three power wires in an 8-pin connector means 360 watts per cable. TWO cables would mean 720 watts.
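Same deal, here's the 8-pin math as a sketch. The 15 A "chart max" and 10 A "conservative" figures are the ones I'm assuming from the ampacity chart above, nothing more:

# Per-wire and per-cable math for an 8-pin PCIe cable (3 power wires)
volts = 12.0
wires_per_8pin = 3

for amps, label in [(15, "chart max"), (10, "conservative")]:
    watts_per_wire = amps * volts                    # 180 W or 120 W
    watts_per_cable = watts_per_wire * wires_per_8pin
    print(f"{label}: {watts_per_wire:.0f} W per wire, "
          f"{watts_per_cable:.0f} W per 8-pin, "
          f"{watts_per_cable * 2:.0f} W for two cables")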
What people DON'T realize is that current GPUs generally have two or three 8-pins (or an 8-pin + 6-pin combo) because ONE is used to drive the GPU chip itself and the other handles everything else on the board. Higher end cards have three for redundancy, which is why they'll often tell you "you can use just two of the three and still get full performance."
So let's use some common sense here. IN REALITY the NVIDIA PATENTED 12VHPWR standard has been around for a while. It's already out on the 3000 series Founders cards. And magically NOW it's an issue? YES, I understand that current 3090s are rated at 350W and can pull up to 450W, which is still shy of the 600W the 12VHPWR standard allows... but claiming newer GPUs are going to ruin those adapters is bullshit.
NVIDIA COULD NOT get power supply companies to adopt their 12VHPWR standard, because it had "Nvidia" written all over it. EVERYONE AND THEIR MOTHER, including this very website, cried about the 12VHPWR standard back before the 3000 series launched: "Why do we need a new power standard? This is stupid. I'm not upgrading my PSU just because Nvidia is using the 12VHPWR on their Founders Edition cards; people can just buy AIB cards with typical power connectors..."
EVGA dropped Nvidia... I bet ONE of the MANY reasons was Nvidia trying to force AIBs to adopt the 12VHPWR standard. IF (and it's a big if) we see AIBs using the 12VHPWR connector, then we'll know for a fact that was ONE of the reasons EVGA pushed back against Nvidia. With basically only 1 to 5 of the 1000s of PSUs you can buy having native 12VHPWR connectors, now being rebranded as "PCIe 5.0," buying a 4000 series card gets harder because now you have to buy the GPU AND a new PSU... especially with news articles like the one above lying to people and telling them their cable adapters are going to start fires...
Nvidia is using TWO marketing tactics. The first is scare tactics: "your adapter will catch fire." The second is marketing lies: claiming the 12VHPWR is a PCIe 5.0 standard when in reality it has NOTHING to do with PCIe 5.0... AMD is getting caught up in this nonsense too, claiming their next-gen GPUs won't push a lot of power because the "PCIe 5.0" cable standard isn't readily available yet, and that they want to wait until those cables are mainstream before pushing power usage. WRONG. Anyone using the NVIDIA PATENTED cable standard has to PAY Nvidia to use it. That means PSU brands have to pay Nvidia, and then AMD would have to pay Nvidia to use it... It reminds me of G-Sync: it's literally no different from adaptive sync or FreeSync, but they force you to buy Nvidia GPUs to use it. And not only do users have to buy Nvidia GPUs to get full G-Sync, monitor makers have to PAY Nvidia to put their "chip" in the monitor to run it... insanity. This is all about Nvidia taking over the industry, and I guarantee EVGA was the first to push back.
I mean, imagine: Nvidia will only let you use ONE 12VHPWR connection, meaning you are limited to 600W. Meanwhile you could have used THREE of the 8-pin PCIe connections we already have and allowed anywhere between 1080 and 1620 total watts (same math I worked out above, just multiplied from one cable to three, see the sketch below).
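Here's that three-cable comparison spelled out, again just scaling the per-cable figures I already worked out (360 W conservative, 540 W chart max), nothing official:

# Scale the per-cable numbers from above to three 8-pin cables
per_cable_watts = {"conservative (10 A per wire)": 360,
                   "chart max (15 A per wire)": 540}
for label, watts in per_cable_watts.items():
    print(f"{label}: {watts * 3} W across three 8-pin cables")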
THE CLAIM that a single 8-pin connection is only good for 150W is pure insanity. MORE THAN LIKELY it's the back end of the PSU that limits how much power can be drawn from a single wire and therefore from the cable as a whole. 150W divided by 3 wires is 50W per wire. 50W divided by 12 volts is about 4.17 amps (4.1666... rounded). That isn't even 5 amps per wire. We KNOW FOR A FACT that 18 gauge wire is capable of 15 amps on the high side. They could easily push the SAME amps the 12VHPWR standard does (about 9 amps) safely. ALL these news articles are FAKE NEWS, probably paid for by Nvidia.
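And one last sketch so nobody has to trust my rounding, using only the 150W spec figure and the three power wires mentioned above:

# Per-wire load implied by the official 150 W rating of an 8-pin PCIe cable
spec_watts = 150       # rated cable limit being questioned above
power_wires = 3        # three 12V conductors in an 8-pin
volts = 12.0

watts_per_wire = spec_watts / power_wires   # 50 W per wire
amps_per_wire = watts_per_wire / volts      # ~4.17 A per wire
print(f"{watts_per_wire:.0f} W per wire, {amps_per_wire:.2f} A per wire")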