Study reveals that AI is terrible for the environment

The environmental footprint of artificial intelligence tools is beginning to worry experts, largely because of the massive amounts of energy that learning models require for training and updating.

Artificial intelligence is not eco-friendly

Artificial intelligence (AI) systems are generating vast emissions, and the situation is getting worse, according to a new scientific study. The growing energy required to train and run ever more complex models, along with rising interest in using them, is having serious environmental consequences, the paper warns.

As these systems become more capable, they require more computing power and, therefore, more energy to operate. For example, OpenAI's current GPT-4 uses 12 times more power than its predecessor. Furthermore, training represents only a small part of a system's total workload: the energy used to actually operate AI tools is estimated at 960 times that used in a single training run.
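To see why day-to-day operation dominates the energy bill, here is a toy calculation. The 960x ratio comes from the study cited above; the absolute training figure is an arbitrary assumption chosen only for illustration.

```python
# Toy illustration of the training vs. inference energy split.
# TRAINING_ENERGY_GWH is an assumed, illustrative one-off cost;
# the 960x multiplier is the estimate quoted in the article.

TRAINING_ENERGY_GWH = 1.0   # assumed one-off training cost (illustrative)
INFERENCE_MULTIPLIER = 960  # lifetime inference vs. one training run

inference_energy = TRAINING_ENERGY_GWH * INFERENCE_MULTIPLIER
total = TRAINING_ENERGY_GWH + inference_energy

# Inference accounts for almost all of the lifetime energy use.
share = inference_energy / total
print(f"Inference share of lifetime energy: {share:.1%}")  # → 99.9%
```

Whatever the one-off training cost is assumed to be, at a 960x ratio inference ends up accounting for over 99% of lifetime energy, which is why the study emphasizes operation rather than training alone.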

The researchers suggest that the impact of these emissions could be vast. AI-related emissions could cost the industry more than 10 billion euros per year, the report indicates, and it calls on governments and regulators to standardize ways of measuring these emissions and to create rules that keep them within acceptable limits.

"The exponential growth of AI capabilities mirrors a worrying increase in its environmental impact. This study highlights the urgent need for the AI industry to adopt greener practices and sustainable standards. Our goal is to provide policymakers with the data they need to tackle AI's carbon footprint through proactive regulations," said Meng Zhang, principal investigator at Zhejiang University.

The increasing use of energy and water by data centers as they expand to incorporate AI is causing concern among decision makers around the world. Data centers used for AI training can consume 10 to 100 MW on average, depending on scale, and training a single large model such as GPT-3 is estimated to consume up to 1 GWh (gigawatt-hour) or more over several weeks of work.
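A back-of-envelope check shows how the figures above fit together: energy is just average power draw multiplied by time. The specific inputs below (a 2 MW slice of a data center running for three weeks, and a grid intensity of 0.4 kg CO2 per kWh) are illustrative assumptions, not values from the study.

```python
# Back-of-envelope estimate of training energy and emissions.
# All inputs are illustrative assumptions, not measured values.

def training_energy_kwh(power_mw: float, hours: float) -> float:
    """Energy = average power draw (MW) x duration (h), in kWh."""
    return power_mw * 1000 * hours

def emissions_tonnes(energy_kwh: float, grid_kg_per_kwh: float = 0.4) -> float:
    """CO2 in tonnes, for an assumed grid intensity (kg CO2 per kWh)."""
    return energy_kwh * grid_kg_per_kwh / 1000

# Assumed: a 2 MW share of a data center running for three weeks.
energy = training_energy_kwh(power_mw=2, hours=21 * 24)
print(f"Energy: {energy:,.0f} kWh")                       # → 1,008,000 kWh
print(f"Emissions: {emissions_tonnes(energy):,.0f} t CO2")
```

Even a modest 2 MW draw sustained for three weeks lands at roughly 1 GWh, the same order of magnitude the article quotes for training a model like GPT-3.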

The problem with data centers

AI systems, especially deep learning models, require enormous computational power to be trained. This computing power comes from data centers that require massive amounts of energy to operate and stay cool.

The complex algorithms used in AI go through millions or even billions of calculations before they are optimized, and this entire process consumes a lot of energy.

It is estimated that training a single advanced AI model can emit as much carbon dioxide as a car emits over several years. In some cases, larger models can emit the equivalent of five cars over their entire lifetimes. These numbers are especially worrying considering the speed at which new AI applications are being developed and adopted across industries.

A concrete example is the use of AI in natural language processing, such as the GPT (Generative Pre-trained Transformer) family of models. These models require immense amounts of data and very high computational capacity, which makes them highly energy-intensive.

Furthermore, these models are trained and updated periodically to improve their accuracy, further increasing their energy consumption and, consequently, their carbon footprint.

With continued innovation and technological advances, AI may become more efficient and sustainable over time. To achieve this, however, technology companies, governments and society must work together to create policies and practices that minimize its environmental impact.

Source: pplware.sapo.pt