Infographic: The Electricity Requirements of AI

One of the less discussed implications of artificial intelligence is the amount of energy required to power the necessary infrastructure.

According to recent calculations, training OpenAI’s GPT-4, which is powered by about 25,000 Nvidia A100 GPUs, required up to 62,000 megawatt hours. That’s equivalent to the energy demand of 1,000 U.S. homes for more than five years.
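The homes-equivalent claim can be sanity-checked with simple arithmetic. The annual household consumption figure below is an assumption, not from the source: roughly 10,600 kWh per year for an average U.S. home (a commonly cited EIA estimate).

```python
# Sanity check of the "1,000 homes for more than five years" figure.
# Assumption (not from the source): an average U.S. home uses
# roughly 10,600 kWh (10.6 MWh) of electricity per year.
training_energy_mwh = 62_000    # upper-bound GPT-4 training energy from the text
homes = 1_000
home_annual_mwh = 10.6          # assumed average annual use per home, in MWh

years = training_energy_mwh / (homes * home_annual_mwh)
print(f"{years:.1f} years")     # roughly 5.8, i.e. "more than five years"
```

Under that assumption the stated equivalence holds: 62,000 MWh spread over 1,000 homes works out to just under six years of typical consumption.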

Meta’s new AI supercluster will feature 350,000 Nvidia H100 GPUs, while X and Google are building massive hardware projects to power their own models.

To put this in perspective, the Visual Capitalist team has created an overview of Microsoft’s increasing electricity needs.

Source: www.emerce.nl