- ChatGPT and other language models are becoming more and more popular
- People use them for fun, and increasingly even in place of classic search engines
- However, each query is surprisingly energy intensive
AI is the future; that much is already obvious. Language models and video, photo, and music generators keep multiplying, and many people use them just for fun or even instead of Google. But few people realize how energy-demanding these models are, and not only when it comes to electricity, but also water consumption.
For language models, and AI in general, to function and respond correctly to our questions and prompts, huge data centers are needed. These have to be powered by enormous amounts of energy, and they also have to be cooled. This is reflected in ever-increasing consumption of both water and electricity. Moreover, according to predictions, consumption will keep growing in both cases, which raises the question of sustainability. Is sustainable operation even possible?
1 search with ChatGPT = 2.9 Wh of electricity
Every single query to a model like ChatGPT is unexpectedly energy intensive. Experts have calculated that answering one of your questions requires about 2.9 Wh of electricity. For comparison, a Google search consumes roughly ten times less.
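A back-of-envelope comparison based on the figures above (2.9 Wh per ChatGPT query, roughly a tenth of that per Google search) might look like the sketch below. The daily query volume is a purely illustrative round number, not a figure from the article:

```python
# Per-query energy figures quoted in the text.
CHATGPT_WH_PER_QUERY = 2.9                        # Wh per ChatGPT query
GOOGLE_WH_PER_QUERY = CHATGPT_WH_PER_QUERY / 10   # "about 10 times lower"

def daily_energy_kwh(queries_per_day: float, wh_per_query: float) -> float:
    """Total daily energy in kWh for a given query volume."""
    return queries_per_day * wh_per_query / 1000

# Illustrative volume: one billion queries per day (hypothetical assumption).
queries = 1_000_000_000
print(f"ChatGPT: {daily_energy_kwh(queries, CHATGPT_WH_PER_QUERY):,.0f} kWh/day")
print(f"Google:  {daily_energy_kwh(queries, GOOGLE_WH_PER_QUERY):,.0f} kWh/day")
```

At that hypothetical volume, the ten-fold per-query difference translates directly into a ten-fold difference in daily consumption.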
And since people are not going to stop using it (quite the contrary), a recent prediction by Goldman Sachs claims that while US electricity use grew by less than 0.5% per year over the previous 20 years, it will grow by at least 2.4% per year from 2024 to 2030.
Global demand for electricity is therefore also expected to grow in the coming years; in 2026 it could be twice as large as in 2022.
Huge water consumption
The second “side effect” of generative models and artificial intelligence is a higher demand for water, which is used to cool data centers. And the quantities involved are incredible.
An enormous amount of water is needed just to train a language model: according to some studies, the training of GPT-3 alone consumed up to 700,000 liters. And each of your conversations with such a model (an average of 10 to 50 questions) consumes roughly 0.5 liters of fresh water for data-center cooling.
Between 2021 and 2022, Microsoft reported that its global water consumption grew by 34%. Additionally, in December 2024 it was disclosed that OpenAI’s language models are used by some 300 million users per week, who send them a billion messages a day. If you try to calculate the water consumption, you arrive at incredible numbers.
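Using only the figures quoted in the text (a billion messages a day, 10 to 50 messages per conversation, about 0.5 liters of cooling water per conversation), that calculation can be sketched as a simple range estimate:

```python
# Rough daily water estimate built from the figures quoted in the text.
MESSAGES_PER_DAY = 1_000_000_000                  # "a billion messages a day"
MSGS_PER_CONVO_LOW, MSGS_PER_CONVO_HIGH = 10, 50  # "10 to 50 questions"
LITERS_PER_CONVO = 0.5                            # ~0.5 L of fresh water per chat

def daily_water_liters(messages_per_day: int, msgs_per_convo: int) -> float:
    """Estimate daily cooling-water use for a given conversation length."""
    conversations_per_day = messages_per_day / msgs_per_convo
    return conversations_per_day * LITERS_PER_CONVO

low = daily_water_liters(MESSAGES_PER_DAY, MSGS_PER_CONVO_HIGH)  # long chats
high = daily_water_liters(MESSAGES_PER_DAY, MSGS_PER_CONVO_LOW)  # short chats
print(f"Estimated cooling water: {low / 1e6:.0f} to {high / 1e6:.0f} million liters per day")
```

Even under these simplified assumptions, the estimate lands in the tens of millions of liters per day.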
Large technology companies are therefore thinking about how to reduce energy consumption, upgrade their servers, or rely more on renewable resources. That is why, alongside the various warnings about disasters a possible dominance of artificial intelligence could bring, sustainability is also a big topic. The world’s supply of water is not infinite, and the ever-higher consumption of electricity will have to be addressed.
Do you use language models for fun?
Source: www.svetandroida.cz