Artificial intelligence is not only powerful but also extremely energy-hungry. ChatGPT, in particular, requires massive amounts of electricity, both to process the billions of daily requests from its users and to train its underlying models.

According to the study How Hungry is AI?, ChatGPT handles approximately 2.5 billion interactions per day, which adds up to nearly 7,832 GWh of electricity per year, the equivalent of the annual consumption of 1.6 million French households.
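For readers who want to verify the orders of magnitude, the short sketch below reproduces the arithmetic implied by these figures. The per-request energy is derived from the quoted totals rather than reported directly, and the per-household consumption of roughly 4,900 kWh per year is an assumption chosen to be consistent with the article's ratio, not a number from the study.

```python
# Back-of-envelope check of the figures quoted above.
# The 2.5 billion daily requests and 7,832 GWh/year come from the article;
# the per-request energy is derived from them, and the household figure
# (~4,900 kWh/year) is an assumed average for a French household.

daily_requests = 2.5e9            # interactions per day (from the article)
annual_energy_gwh = 7_832         # GWh per year (from the article)

annual_requests = daily_requests * 365
wh_per_request = annual_energy_gwh * 1e9 / annual_requests    # 1 GWh = 1e9 Wh
print(f"Implied energy per request: {wh_per_request:.1f} Wh")  # ~8.6 Wh

household_kwh_per_year = 4_900    # assumed annual electricity use per household
households = annual_energy_gwh * 1e6 / household_kwh_per_year  # 1 GWh = 1e6 kWh
print(f"Equivalent households: {households / 1e6:.1f} million")  # ~1.6 million
```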

However, the most significant environmental impact lies in the training of the models, the step that enables them to interpret data and generate relevant responses. The initial training of a model like GPT-4 is estimated to have consumed nearly 50 GWh, as much as San Francisco uses in three days, or the annual consumption of about 4,000 French households. In more tangible terms, this corresponds to charging a smartphone 3.3 billion times.
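A similar back-of-envelope check works for the smartphone comparison; the figure of roughly 15 Wh for one full smartphone charge is an assumption used for illustration, not a value reported by the study.

```python
# Rough check of the GPT-4 training comparison.
# The 50 GWh training estimate comes from the article; the battery capacity is assumed.

training_energy_gwh = 50          # estimated GPT-4 training energy (from the article)
smartphone_charge_wh = 15         # assumed energy for one full smartphone charge

charges = training_energy_gwh * 1e9 / smartphone_charge_wh     # 1 GWh = 1e9 Wh
print(f"Smartphone charges: {charges / 1e9:.1f} billion")      # ~3.3 billion
```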

This consumption is not limited to training. Operating ChatGPT at the scale of a service such as Google Search requires hundreds of thousands of high-performance servers, further worsening the ecological impact. A precise calculation remains difficult, as tech giants disclose little data about the actual consumption of their data centers.

These figures highlight the major environmental challenges associated with artificial intelligence and call for a reconsideration of how these technologies are developed and deployed.
