Artificial intelligence could account for nearly half of data center power by the end of the year
According to new analysis, AI systems could account for nearly half of data center power consumption by the end of this year.
Alex de Vries-Gao, founder of the tech sustainability website Digiconomist, released his estimate as the International Energy Agency (IEA) forecast that, by the end of this decade, AI could require almost as much energy as Japan uses today.
De Vries-Gao's calculations, to be published in the sustainable energy journal Joule, are based on the power consumed by chips made by Nvidia and Advanced Micro Devices (AMD) that are used to train and operate AI models. The paper also considers the energy consumption of chips made by other companies, such as Broadcom.
The IEA estimates that all data centers (excluding cryptocurrency mining) consumed 415 terawatt hours (TWh) of electricity last year. De Vries-Gao argues in his research that AI could already account for 20% of that total.
De Vries-Gao said his calculation involves many variables, such as the energy efficiency of data centers and the power consumed by the servers that handle AI workloads. Data centers are the backbone of AI technology, and their high energy demands make sustainability a key issue in the development and use of artificial intelligence systems.
By the end of 2025, De Vries-Gao estimates, AI systems could account for up to 49% of total data center power consumption, again excluding crypto mining. His research estimates that AI consumption could reach 23 gigawatts (GW), twice the total energy consumption of the Netherlands.
However, De Vries-Gao said several factors could slow hardware demand, such as waning demand for applications like ChatGPT. Another could be geopolitical tensions that restrict the production of AI hardware, such as export controls. De Vries-Gao cites the example of barriers on chip exports to China, which helped prompt the release of the DeepSeek R1 AI model, which reportedly required fewer chips.
“These innovations can reduce the computing and energy costs of AI,” De Vries-Gao said.
But he said any efficiency gains could encourage more use of AI. Multiple countries trying to build their own AI systems, a trend known as "sovereign AI," could also increase hardware demand. De Vries-Gao also noted that Crusoe Energy, a U.S. data center startup, has secured 4.5GW of gas-powered energy capacity for its infrastructure, with ChatGPT developer OpenAI among its prospective customers through its Stargate joint venture.
"There are early signs that these [Stargate] data centers could intensify reliance on fossil fuels," De Vries-Gao wrote.
On Thursday, OpenAI announced the launch of a Stargate project in the United Arab Emirates, the first outside the United States.
Microsoft and Google acknowledged last year that their AI drives are jeopardizing their ability to meet their internal environmental targets.
De Vries-Gao said information about the power demands of AI is increasingly scarce, with analysts describing it as an "opaque industry." The EU AI Act requires AI companies to disclose the energy consumed in training a model, but not the energy used in day-to-day operation.
Professor Adam Sobey, mission director for sustainability at the Alan Turing Institute, an AI research body in the UK, said more transparency is needed about the energy consumed by AI systems, and about the potential savings they could bring to carbon-emitting industries such as transportation and energy.
"I suspect we don't need many very good use cases [of AI] to offset the energy used on the front end," Sobey said.