
How AI’s Insatiable Energy Demand Threatens Big Tech’s Climate Goals – Mother Jones

The Google data center sits across from a cluster of wind turbines.

Google data center in the Netherlands. Robin Utrecht/Action Press/ZUMA Press

This story was originally published by the Guardian and is reproduced here as part of the Climate Desk collaboration.

The artificial intelligence boom has driven big tech companies’ share prices to new records, but it has come at the expense of the sector’s climate aspirations.

Google acknowledged Tuesday that the technology threatens its environmental goals after disclosing that its data centers, a key part of its AI infrastructure, had helped increase its greenhouse gas emissions by 48 percent since 2019. It said the “significant uncertainty” around achieving its 2030 net-zero emissions goal — reducing the total amount of CO2 emissions it is responsible for to zero — included “uncertainties about the future environmental impacts of AI, which are complex and difficult to predict.”

Google follows Microsoft, the biggest financial backer of ChatGPT creator OpenAI, which has admitted that its “moonshot” of achieving net-zero emissions by 2030 may not succeed because of its AI strategy.

So will technology be able to reduce the negative impact of AI on the environment, or will the industry continue as it is because the reward for dominance is so great?

Why does AI threaten tech’s environmental goals?

Data centers are a fundamental part of training and running AI models, such as Google’s Gemini or OpenAI’s GPT-4. They house sophisticated computer hardware or servers that process the vast amounts of data that underpin AI systems. They require large amounts of electricity to operate, which generates CO2 depending on the energy source, as well as creating “embedded” CO2 from the costs of manufacturing and transporting the necessary equipment.

According to the International Energy Agency, total electricity consumption in data centers could double from 2022 levels to 1,000 TWh (terawatt-hours) in 2026, equivalent to the energy demand of Japan, while research firm SemiAnalysis calculates that AI will help data centers use 4.5 percent of global energy by 2030. Water consumption is also significant, with one study estimating that AI could be responsible for using up to 6.6 billion cubic meters of water by 2027 – almost two-thirds of England’s annual consumption.
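The comparisons above can be sanity-checked with some back-of-envelope arithmetic. The reference values below are my own rough assumptions (Japan’s annual electricity demand of about 1,000 TWh, and the roughly 10 billion cubic meters of annual English water use implied by the “two-thirds” figure), not numbers from the article:

```python
# Back-of-envelope check of the figures quoted above.
# Reference values are assumptions, not figures from the article.

iea_datacenter_2026_twh = 1000      # IEA projection cited above
japan_annual_demand_twh = 1000      # assumption: Japan uses ~1,000 TWh/year

ai_water_2027_bn_m3 = 6.6           # study estimate cited above
england_annual_water_bn_m3 = 10.0   # assumption implied by "two-thirds"

# Ratio of projected data-center electricity use to Japan's demand (~1.0)
electricity_ratio = iea_datacenter_2026_twh / japan_annual_demand_twh
# Ratio of projected AI water use to England's consumption (~0.66)
water_ratio = ai_water_2027_bn_m3 / england_annual_water_bn_m3

print(electricity_ratio, water_ratio)
```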

What do experts say about the environmental impact?

A recent UK government report on AI safety found that the carbon intensity of the energy source used by tech companies is a “key variable” in calculating the environmental costs of technology, but added that a “significant proportion” of AI model training still relies on fossil fuel energy.

Tech companies are buying renewable energy contracts in an attempt to meet their environmental goals. Amazon, for example, is the world’s largest corporate buyer of renewable energy. But some experts say that’s pushing other energy users toward fossil fuels because there isn’t enough clean energy for everyone.

“Not only is energy consumption rising, but Google is also grappling with the need to meet this increased demand with sustainable energy sources,” says Alex de Vries, founder of Digiconomist, a website that tracks the environmental impact of new technologies.

Is there enough renewable energy for everyone?

Governments around the world plan to triple the world’s renewable energy resources by the end of the decade to cut fossil fuel use in line with climate goals. But the ambitious commitment, agreed at last year’s COP28 climate talks, is already in doubt, with experts worried that the surge in energy demand from AI data centers could push it even further off track.

The International Energy Agency (IEA), the global energy watchdog, has warned that although global renewable energy capacity grew at its fastest rate in 20 years in 2023, the world is on track only to double its renewable capacity by 2030 under current government plans, falling short of the tripling pledge.

In response to AI’s energy needs, technology companies may have to invest more in new renewable energy projects themselves to meet the growing demand.

How quickly will we be able to build new renewable energy projects?

Onshore renewable energy projects, such as wind and solar farms, are relatively quick to build—they can take less than six months to develop. But slow planning rules in many developed countries, as well as a global backlog in connecting new projects to the grid, can add years to the process. Offshore wind farms and hydroelectric projects face similar challenges, on top of construction times of two to five years.

That has raised concerns about whether renewable energy sources will be able to keep up with AI’s expansion. According to the Wall Street Journal, big tech companies have already tapped a third of America’s nuclear power plants to provide low-carbon electricity to their data centers. But without investment in new energy sources, these deals will divert low-carbon electricity from other users, leading to greater use of fossil fuels to meet overall demand.

Will AI’s demand for electricity grow forever?

The usual rules of supply and demand would suggest that as AI uses more electricity, energy costs would rise and the industry would be forced to economize. But the unique nature of the industry means that the world’s largest companies may instead choose to power through spikes in electricity costs, burning billions of dollars in the process.

The largest and most expensive data centers in the AI sector are those used to train “frontier” AIs, systems like GPT-4o and Claude 3.5, which are more powerful and efficient than any others. The leader in this space has changed over the years, but OpenAI is generally near the top, jockeying for position with Anthropic, the maker of Claude, and Google’s Gemini.

Competition at the frontier is already considered winner-take-all, with little to stop customers from switching to the newest leader. That means that if one company spends $100 million training a new AI system, its competitors must decide whether to spend even more themselves or drop out of the race altogether.

Worse still, the race for so-called “AGI,” AI systems that can do anything a human can do, means it may be worth spending hundreds of billions of dollars on a single training run — if it leads your company to monopolize a technology that, as OpenAI says, could “elevate humanity.”

Won’t AI companies learn to use less electricity?

Every month, new breakthroughs in AI technology emerge that enable companies to do more with less. For example, in March 2022, DeepMind’s Chinchilla project showed researchers how to train pioneering AI models using radically less computing power by changing the ratio between the amount of training data and the size of the resulting model.
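The Chinchilla finding can be reduced to a rule of thumb: for a fixed compute budget, training data should scale roughly in proportion to model size, at about 20 tokens per parameter. That headline ratio comes from the Chinchilla paper itself (Hoffmann et al., 2022), not from this article, and is an approximation:

```python
# Toy illustration of the Chinchilla compute-optimal scaling rule.
# The ~20 tokens-per-parameter ratio is the paper's approximate headline
# result; real compute-optimal ratios vary with the compute budget.

TOKENS_PER_PARAM = 20  # approximate compute-optimal ratio

def chinchilla_optimal_tokens(n_params: float) -> float:
    """Rough compute-optimal training-token count for a model of n_params."""
    return TOKENS_PER_PARAM * n_params

# Chinchilla itself: 70 billion parameters trained on ~1.4 trillion tokens.
print(f"{chinchilla_optimal_tokens(70e9):.2e}")
```

Earlier frontier models had been trained on far fewer tokens per parameter; following the corrected ratio let smaller models match larger ones, which is the “more with less” the paragraph above describes.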

But this did not cause the same AI systems to use less electricity; instead, it resulted in the same amount of electricity being used to create even better AI systems. In economics, this phenomenon is known as “Jevons’ paradox,” after the economist who observed that James Watt’s improvement of the steam engine, which allowed for the use of much less coal, instead led to a huge increase in the amount of fossil fuel burned in England. As the price of steam power fell after Watt’s invention, new uses were discovered that would not have been profitable if the energy had been expensive.