
Is artificial intelligence a serious burden on the world’s energy resources?

Data centers, especially those running artificial intelligence programs, are driving increasing demand for electricity.

When Google announced this week that its carbon emissions had increased by 48 percent since 2019, it blamed artificial intelligence.

American technology companies are building vast networks of data centers around the world and say artificial intelligence is driving the expansion, putting a spotlight on how much energy the technology consumes and its impact on the environment.

How does artificial intelligence use electricity?

Whenever a user sends a query to a chatbot or other AI tool, the request is routed to a data center, where servers do the computing needed to produce a response.

Even before that point, enormous computing power is needed to develop AI programs known as large language models (LLMs).

All that computing draws more and more electricity, and the servers run hotter, which means still more electricity is needed to cool them.

The International Energy Agency (IEA) said in a report earlier this year that data centers use on average about 40 percent of their electricity for computing and 40 percent for cooling.

Why are experts concerned?

Since OpenAI launched its ChatGPT bot in late 2022, big tech companies have been rushing to equip all their products with AI.

Many experts fear that these new products will cause a sharp increase in electricity consumption.

First, AI-based services require more power than their non-AI counterparts.

For example, various studies have shown that each query sent to ChatGPT uses roughly 10 times as much electricity as a single Google search.

So if Google were to switch all of its search queries to AI (around nine billion searches per day), the company's electricity consumption could rise sharply.
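To put that scenario into numbers, here is a minimal back-of-the-envelope sketch in Python. The per-query energy figures are illustrative assumptions rather than figures from the article, chosen only to reflect the roughly tenfold difference cited above.

```python
# Back-of-the-envelope sketch of the "AI search" scenario described above.
# The per-query figures are illustrative assumptions, not article data:
# ~0.3 Wh for a conventional search and ~3 Wh for an AI-assisted one
# (consistent with the roughly tenfold factor cited by the studies).

SEARCHES_PER_DAY = 9e9          # ~nine billion Google searches per day
WH_CONVENTIONAL = 0.3           # assumed Wh per conventional search
WH_AI = 3.0                     # assumed Wh per AI-assisted search
DAYS_PER_YEAR = 365

def annual_twh(wh_per_query: float) -> float:
    """Convert a per-query figure in Wh into annual terawatt-hours."""
    return wh_per_query * SEARCHES_PER_DAY * DAYS_PER_YEAR / 1e12

conventional = annual_twh(WH_CONVENTIONAL)
ai_assisted = annual_twh(WH_AI)
print(f"Conventional search: ~{conventional:.1f} TWh/year")
print(f"AI-assisted search:  ~{ai_assisted:.1f} TWh/year")
print(f"Additional demand:   ~{ai_assisted - conventional:.1f} TWh/year")
```

On these assumptions the extra demand comes to roughly nine terawatt-hours a year; different per-query figures scale the result proportionally.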

Most of these new services and products are built on large language models.

Training these models is extremely time-consuming and usually requires very powerful computer systems.

Those systems, in turn, require more cooling, which consumes additional electricity.

How much energy does artificial intelligence use?

Before the era of AI, estimates suggested that data centers accounted for about one percent of global electricity demand.

The IEA report estimates that in 2022, data centers, cryptocurrencies, and AI together consumed 460 TWh of electricity worldwide, almost two percent of total global electricity demand.

The IEA estimated that by 2026 this figure could double, roughly matching Japan's total electricity consumption.
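As a quick sanity check of those figures, the arithmetic below uses assumed round numbers for world electricity demand and for Japan's annual consumption; neither of those two figures comes from the article.

```python
# Quick arithmetic check of the IEA figures quoted above.
# Global demand and Japan's consumption are rough external assumptions
# (~26,000 TWh and ~940 TWh respectively), not numbers from the article.

DATA_CENTER_TWH_2022 = 460       # IEA estimate for data centers, crypto and AI
GLOBAL_DEMAND_TWH = 26_000       # assumed world electricity demand, order of magnitude
JAPAN_TWH = 940                  # assumed annual electricity use of Japan

share = DATA_CENTER_TWH_2022 / GLOBAL_DEMAND_TWH
doubled = 2 * DATA_CENTER_TWH_2022
print(f"2022 share of global demand: ~{share:.1%}")          # ~1.8%, i.e. "almost two percent"
print(f"Doubled by 2026: ~{doubled} TWh vs Japan at ~{JAPAN_TWH} TWh")
```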

Alex De Vries, a researcher who runs the Digiconomist website, modeled the electricity consumption of AI alone, based on the sales forecasts of the American company NVIDIA, which dominates the market for AI-specialized servers.

Late last year, he concluded in an article that if NVIDIA's sales forecasts for 2023 proved correct and all of those servers ran at full capacity, they alone could consume 85.4–134.0 TWh of electricity a year, comparable to the annual consumption of Argentina or Sweden.
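A fleet-level estimate of this kind can be sketched as follows. The server count and per-unit power draws are assumptions chosen for illustration (they happen to reproduce the quoted range) and are not stated in the article.

```python
# Illustrative reconstruction of a fleet-level estimate like the one above.
# The unit count and per-server power draws are assumptions for illustration
# (not stated in the article): ~1.5 million AI servers drawing roughly
# 6.5 to 10.2 kW each, running continuously at full capacity.

HOURS_PER_YEAR = 8_760
SERVERS = 1_500_000              # assumed number of deployed AI servers
KW_LOW, KW_HIGH = 6.5, 10.2      # assumed per-server power draw in kW

def fleet_twh(servers: int, kw_per_server: float) -> float:
    """Annual consumption in TWh for a fleet running at full capacity."""
    return servers * kw_per_server * HOURS_PER_YEAR / 1e9  # kWh -> TWh

low = fleet_twh(SERVERS, KW_LOW)
high = fleet_twh(SERVERS, KW_HIGH)
print(f"Estimated range: {low:.1f}-{high:.1f} TWh per year")
```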

“The numbers I gave in the article were already conservative from the start because I couldn’t take into account things like cooling requirements,” he told AFP.

He added that NVIDIA's server deployments have exceeded last year's forecasts, so the real figures are certain to be higher.

How are data centers coping?

Fabrice Coquio of Digital Realty, a data center company that leases capacity to other firms, told AFP in April, during a visit to one of its massive facilities north of Paris, that artificial intelligence would transform his industry.

“It will be exactly the same (as the cloud), maybe a little more extensive in terms of implementation,” he said.

Part of Digital Realty’s newest data center in La Courneuve – a giant building resembling a football stadium – will be dedicated to artificial intelligence.

Coquio explained that normal computing tasks can be handled using server racks placed in rooms equipped with efficient air conditioning.

However, AI racks use much more powerful components, run much hotter and require water to be pumped through the equipment itself, he added.

“It certainly requires different servers, storage devices and communications equipment,” Coquio said.

Is this sustainable?

The biggest players in the AI and data center markets — Amazon, Google, and Microsoft — are trying to reduce their carbon footprint by buying massive amounts of renewable energy.

Amazon spokesman Prasad Kalyanaraman told AFP that the company’s data center division, AWS, is “the largest buyer of renewable energy in the world”.

AWS has committed to achieving net zero carbon emissions by 2040. Google and Microsoft have committed to achieving this goal by 2030.

However, building new data centers and drawing more power at existing ones makes those green energy goals harder to reach.

Google and Microsoft have both said in their latest reports that their greenhouse gas emissions have increased over the past few years.

Google’s emissions rose 48 percent compared with 2019, and Microsoft’s rose 30 percent compared with 2020.

Both companies blame artificial intelligence.

Microsoft President Brad Smith told Bloomberg in May that the pledge was a “moonshot” made before the “explosion” of artificial intelligence, adding that “the moon is five times farther away than it was in 2020.”

© 2024 AFP

Citation: Is artificial intelligence a serious burden on the world’s energy resources? (2024, July 5) retrieved July 6, 2024 from https://techxplore.com/news/2024-07-ai-major-world-energy.html

This document is subject to copyright. Apart from any fair use for private study or research, no part may be reproduced without written permission. The content is provided for informational purposes only.