DC/DC converters become less efficient at higher input voltages. During every switching transition, the MOSFETs briefly pass through their linear (partially conducting) region, where they drop voltage and carry current at the same time. The higher the supply voltage, the more power they dissipate during that transition. The job of the gate driver is to push the MOSFET through this region as quickly as possible, which in turn means the MOSFET should require as little gate charge as possible.
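A rough back-of-the-envelope sketch of why this loss term grows with input voltage. The numbers here (load current, transition time, switching frequency) are hypothetical illustration values, not from the post; the formula is the common first-order approximation for hard-switching losses:

```python
# First-order hard-switching loss estimate: during each transition the
# MOSFET simultaneously sees the full input voltage and the load current.
# P_sw ~= 0.5 * V_in * I_load * (t_rise + t_fall) * f_sw
# All values below are illustrative assumptions.

I_load = 10.0      # A, load current (assumed)
t_sw = 20e-9       # s, combined rise+fall time, set by gate drive / gate charge
f_sw = 500e3       # Hz, switching frequency (assumed)

P_sw_12 = 0.5 * 12.0 * I_load * t_sw * f_sw   # loss at a 12 V input
P_sw_24 = 0.5 * 24.0 * I_load * t_sw * f_sw   # same design at 24 V input

print(f"Switching loss at 12 V: {P_sw_12:.2f} W")   # ~0.6 W
print(f"Switching loss at 24 V: {P_sw_24:.2f} W")   # ~1.2 W, doubled
```

The point of the sketch: with everything else fixed, this loss term scales linearly with input voltage, so a faster transition (lower gate charge, stronger gate drive) is the main lever to keep it down.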
It is certainly possible to design a VRM for 24V or 48V, but you will need a different set of parts. The entire industry has standardized on 12V, which brings clear price and size advantages.
Also, the losses in the VRM are usually much greater than the losses in the cables and connectors. Say you have a 100W VRM with 95% efficiency (a very clean design): about 5W is lost there. The roughly 105W of input power is supplied at, say, 12V, which corresponds to 8.75A, so two VCC/GND pins are sufficient. To lose the same 5W in transit, the full cable-plus-connector path would need a series resistance of about 65 mΩ. PCIe power cables are usually made with 18AWG wire, which has a resistance of roughly 21 Ω/km (21 mΩ/m). You would only reach a 5W loss with about 3m of conductor (1.5m of extension cable, since the current flows out and back), and even then only if you botched the wiring so that just one VCC/GND pair carries the current instead of the usual 2 or 3.
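The arithmetic above can be checked directly. This sketch just reproduces the post's numbers (100W load, 95% efficiency, 12V rail, 18AWG at ~21 mΩ/m); small differences from the quoted "65 mΩ" come from the post rounding the loss to a flat 5W:

```python
# Sanity-check of the loss comparison in the comment above.
P_out = 100.0                 # W delivered by the VRM
eff = 0.95                    # VRM efficiency
P_in = P_out / eff            # W drawn from the 12 V rail (~105.3 W)
vrm_loss = P_in - P_out       # ~5.3 W dissipated inside the VRM

V = 12.0
I = P_in / V                  # ~8.77 A flowing through the cable

# Series resistance that would dissipate the same power in the cable:
R_cable = vrm_loss / I**2     # P = I^2 * R  ->  ~68 mOhm (post rounds to ~65)

# Length of 18AWG conductor (about 21 mOhm per metre) with that resistance:
r_per_m = 0.021               # Ohm/m
length = R_cable / r_per_m    # ~3.3 m of wire, i.e. ~1.6 m of cable round trip

print(f"VRM loss: {vrm_loss:.1f} W at {I:.2f} A")
print(f"Equivalent cable resistance: {R_cable*1000:.0f} mOhm "
      f"= {length:.1f} m of 18AWG conductor")
```

With 2 or 3 VCC/GND pairs in parallel, the effective resistance per metre drops accordingly, so in a correctly wired cable the break-even length gets several times longer still.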
So no, carrying the power half a meter or more through the computer case is not the problem, provided of course that you seat all the connectors properly.
[Comment edited by Hans1990 on 5 July 2023 17:38]