The new Nvidia GeForce RTX 5090 graphics cards are not only expensive, but also quite demanding. A TGP of 575 watts and the recommendation of a kilowatt power supply are considerable demands. Is this really the right direction?
Sure, the Nvidia GeForce RTX 5090 isn’t a classic gaming graphics card, except perhaps for those with an unlimited budget who want to show off that they own the latest, greatest, and most expensive. Otherwise it is aimed mainly at professionals: game developers, people training artificial intelligence models, those working with VR, and the like. For them, neither consumption nor price is a decisive factor, because, to put it bluntly, even a card in this price range is still the cheaper option for them.
A card in this price range is a solution for professionals, or rather for semi-professionals who have not yet reached for a specialized professional graphics card or a dedicated AI solution. It is a kind of jack-of-all-trades: a universal graphics card that combines high performance with a lot of memory and can still be used for gaming, which makes it an ideal choice for PC game developers, for example. They will appreciate both the performance, which shows what next-generation graphics will look like, and the large amount of memory, which allows them to work with unoptimized models and code.
But still…
Six hundred watts? Six hundred watts…
The Nvidia RTX 4090 had a TGP of 450 watts and a new power connector – which provided plenty of entertainment of its own when it started to melt, and occasionally burn, after being plugged in the wrong way. And if the Gamers Nexus analysis is to be believed, this connector is basically at its theoretical limits and will not be improved in any way, so the problem may well repeat itself. It is simply too much current being forced through too-thin cables into connectors with so little contact area that, thanks to contact resistance, they start behaving a bit like soldering irons.
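To put the soldering-iron remark into numbers, here is a rough back-of-the-envelope sketch. The 600 W limit, the 12 V rail, and the six current-carrying 12 V pins are the connector’s actual parameters; the 5 milliohm contact resistance is purely an illustrative assumption, and a worn or poorly seated contact can be far worse.

```python
# Back-of-the-envelope numbers for the 12VHPWR connector at its 600 W limit.
# The 5 mOhm contact resistance is an illustrative assumption; a well-seated
# contact is better, a worn or partially inserted one can be much worse.

POWER_W = 600.0        # maximum power the connector is specified for
RAIL_V = 12.0          # supply rail voltage
POWER_PINS = 6         # 12 V pins carrying the current (plus 6 ground returns)
R_CONTACT_OHM = 0.005  # assumed contact resistance per pin (5 milliohms)

total_current = POWER_W / RAIL_V               # ~50 A in total
per_pin_current = total_current / POWER_PINS   # ~8.3 A per pin if shared evenly
heat_per_contact = per_pin_current ** 2 * R_CONTACT_OHM   # I^2 * R loss

print(f"Total current:    {total_current:.1f} A")
print(f"Current per pin:  {per_pin_current:.1f} A")
print(f"Heat per contact: {heat_per_contact:.2f} W")

# If one pin loses contact, the other five pick up its share and the
# quadratic I^2 term pushes the dissipation per contact up accordingly.
degraded = (total_current / (POWER_PINS - 1)) ** 2 * R_CONTACT_OHM
print(f"With 5 pins left: {degraded:.2f} W per contact")
```

The absolute wattage per contact looks small, but it is dumped into a tiny plastic housing with no cooling, and it rises with the square of the current the moment the load stops being shared evenly.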
And that’s not all. Look at how small a chip we are actually cramming 600 watts into! To push such currents into the chip through its package, it cannot all go through a single pin, as it did with the first microprocessors; it has to be spread across a whole array of power pins. (I couldn’t find out how many the chip physically has; every source deals only with the wretched 12VHPWR power connector.)
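Why a single pin cannot possibly work becomes obvious once the power is converted to current at the voltage the GPU core actually runs on. The roughly 1 V core voltage and the half-amp limit per power bump in the sketch below are assumed round numbers for illustration, not published figures for the chip.

```python
# Rough illustration of why the GPU package needs a whole array of power
# pins/bumps. The core voltage and per-bump current limit are assumed
# round numbers, not published specifications for this chip.

TGP_W = 575.0        # card-level TGP of the RTX 5090
RAIL_V = 12.0        # voltage arriving over the connector
CORE_V = 1.0         # assumed GPU core voltage (order of magnitude)
BUMP_LIMIT_A = 0.5   # assumed safe current per individual power bump

current_at_12v = TGP_W / RAIL_V    # what the cable has to carry (~48 A)
current_at_core = TGP_W / CORE_V   # what the die has to swallow (~575 A)

# Not all of the TGP ends up at core voltage (memory, fans, losses),
# so treat this as an upper-bound sketch.
print(f"Current on the 12 V side: {current_at_12v:.0f} A")
print(f"Current on the core side: up to {current_at_core:.0f} A")
print(f"Power bumps needed:       at least {current_at_core / BUMP_LIMIT_A:.0f}")
```

Hundreds of amps at around one volt is the same power as tens of amps at twelve volts, which is exactly why the current has to fan out across hundreds of tiny contacts instead of one.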
The whole power-delivery cascade is anything but trivial. You cannot pump power equivalent to a small hair dryer into the chip through a single pin; you have to split it and coordinate it, because if the power cascade fails, very bad things happen. Very unpleasant things also happen when, for example, you seat a processor in its socket the wrong way and power reaches pins the designers never intended. Gamers Nexus covered this nicely as well, in an analysis of how a misplaced processor can write off both the CPU and the board on the first try. That is simply no fun!
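The same arithmetic applies to the VRM cascade that does the splitting. The phase count and the power-stage rating below are illustrative assumptions rather than the actual RTX 5090 board layout; the point is only how badly the numbers break when the current is not shared.

```python
# The same arithmetic for the VRM cascade that splits the core current.
# Phase count and power-stage rating are illustrative assumptions,
# not the actual RTX 5090 board layout.

CORE_POWER_W = 575.0   # worst case: the whole TGP delivered at core voltage
CORE_V = 1.0           # assumed core voltage
PHASES = 20            # assumed number of VRM phases feeding the core
STAGE_LIMIT_A = 70.0   # assumed rating of a single smart power stage

core_current = CORE_POWER_W / CORE_V
per_phase = core_current / PHASES

print(f"Core current:      {core_current:.0f} A")
print(f"Shared by phases:  {per_phase:.1f} A each (limit {STAGE_LIMIT_A:.0f} A)")

# If the splitting and coordination fail and one stage faces the whole
# load, it is overloaded by nearly an order of magnitude.
print(f"One stage alone:   {core_current:.0f} A vs. a {STAGE_LIMIT_A:.0f} A rating")
```

As long as the phases share the load, each stage runs with comfortable headroom; the moment the coordination breaks down, a single stage is asked to carry many times its rating, and that is when components burn.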
Source: pctuning.cz