AI drive brings Microsoft’s ‘green moonshot’ down to Earth in west London

If you want a measure of Microsoft’s progress towards its green goals, look closer to Earth: at a construction site on an industrial estate in west London.

The company’s Park Royal data centre is part of its commitment to driving artificial intelligence (AI) forward, but that ambition runs counter to its goal of becoming carbon negative by 2030.

Microsoft says the centre will be powered entirely by renewable energy. But building data centres and the servers that fill them means the company’s scope 3 emissions, such as the CO2 associated with the materials in its buildings and the electricity people consume when using products such as the Xbox, are more than 30% higher than in 2020. As a result, the company is overshooting its overall emissions target by about the same margin.

This week, Microsoft co-founder Bill Gates said artificial intelligence will help fight climate change because big tech companies are “really willing” to pay more to use clean electricity sources so “they can say they’re using green energy.”

In the short term, AI has been a problem for Microsoft’s green goals. Brad Smith, Microsoft’s outspoken president, once called the company’s carbon ambitions a “moonshot”. In May, stretching the metaphor to its limits, he admitted that “the moon has moved” because of the company’s AI strategy. Microsoft plans to spend £2.5bn over the next three years building AI data centre infrastructure in the UK, and this year it has announced new data centre projects around the world, including in the US, Japan, Spain and Germany.

Training and operating the AI models that underpin products such as OpenAI’s ChatGPT and Google’s Gemini requires large amounts of electricity to power and cool the associated hardware, and additional carbon is emitted in manufacturing and transporting that equipment.

“It’s a technology that increases energy consumption,” says Alex de Vries, founder of Digiconomist, a website that monitors the environmental impact of new technologies.

The International Energy Agency estimates that data centres’ total electricity consumption could double from 2022 levels to 1,000 TWh (terawatt hours) in 2026, equivalent to Japan’s entire electricity demand. According to calculations by the research firm SemiAnalysis, AI will push data centres to use 4.5% of global electricity generation by 2030.
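
For a sense of what that projection implies, here is a minimal back-of-envelope sketch in Python. The 2022 baseline of roughly 460 TWh is an assumption drawn from the IEA’s own reporting, not a figure given in this article.

```python
# Annual growth implied by the IEA projection cited above.
# The 2022 baseline (~460 TWh) is an assumption from IEA reporting,
# not a figure quoted in this article.

TWH_2022 = 460
TWH_2026 = 1_000
YEARS = 2026 - 2022

cagr = (TWH_2026 / TWH_2022) ** (1 / YEARS) - 1
print(f"Implied growth in data centre demand: {cagr:.1%} a year")  # ~21.4% a year
```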

This means that alongside concerns about the impact of artificial intelligence on jobs and even human life, there is also the question of its environmental cost. Last week, the International Monetary Fund said governments should consider carbon taxes to capture the environmental costs of AI, whether as a general carbon levy that catches server emissions within its scope, or through other means, such as a specific tax on the CO2 generated by that equipment.

All the big tech companies involved in AI – Meta, Google, Amazon, Microsoft – are hunting for renewable energy sources to meet their climate goals. In January, Amazon, the world’s largest corporate buyer of renewable energy, announced it had bought more than half the output of an offshore wind farm in Scotland, while Microsoft said in May it would back $10bn (£7.9bn) of renewable energy projects. Google aims to run its data centres entirely on carbon-free energy by 2030.

A Microsoft spokesman said: “We remain steadfast in our commitment to our climate goals.”

Gates, who stepped down from Microsoft’s board in 2020 but retains a stake in the company through the Gates Foundation Trust, has argued that artificial intelligence can directly help fight climate change. Speaking on Thursday, he said the extra electricity demand would be matched by new investment in green generation that more than compensated for AI’s consumption.

A recent government-backed report found that “the carbon intensity of an energy source is a key variable” when calculating AI emissions, although it added that “a significant proportion of AI training around the world still relies on high-carbon sources like coal or natural gas”. Water needed to cool servers is also an issue. One study estimated that AI could use up to 6.6 billion cubic metres of water by 2027, almost two-thirds of England’s annual water consumption.

De Vries argues that the scramble for sustainable computing power is straining the supply of renewable energy, which will force fossil fuels to fill the gap elsewhere in the global economy.

“Greater energy consumption means we don’t have enough renewable energy sources to cover this increase,” he says.

Server rooms in data centres are energy-intensive. Photograph: i3D_VR/Getty Images/iStockphoto

NexGen Cloud, a British company that offers sustainable cloud computing (the data centre-based business of delivering IT services such as data storage and computing power over the internet), says renewable energy is available to power AI data centres, as long as they avoid big cities and are sited near sources of hydroelectric or geothermal power.

“The norm in the industry is to build around economic centres, not renewable energy sources,” says Youlian Tzanev, co-founder of NexGen Cloud.

This makes it harder for any AI-focused tech company to hit its carbon targets. Amazon, the world’s largest cloud services provider, aims to reach net zero – removing as much carbon dioxide as it emits – by 2040, and to match its global electricity consumption with 100% renewable energy by 2025. Google and Meta are aiming for the same net zero target by 2030. OpenAI, the creator of ChatGPT, uses Microsoft data centres to train and run its products.

There are two main ways in which large language models – the technology underpinning chatbots such as ChatGPT and Gemini – consume energy. The first is the training phase, in which a model ingests vast amounts of data gathered from the internet and elsewhere and builds a statistical understanding of language, which ultimately enables it to produce convincing-looking responses to queries.

The upfront energy cost of training AI is astronomical, which keeps smaller companies (and even smaller governments) out of the sector unless they have a spare $100m to spend on a training run. Yet it is dwarfed by the cost of actually running the resulting models, a process known as “inference”. According to the analyst Brent Thill at the investment firm Jefferies, 90% of AI’s energy costs sit in the inference phase: the electricity consumed when people ask an AI system to answer fact-based queries, summarise a passage of text, or write an academic paper.
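
To see why inference dominates, consider a minimal sketch under stated assumptions: the training figure is roughly in line with published estimates for a GPT-3-scale model, while the per-query energy and query volume are purely illustrative, not numbers reported by Jefferies.

```python
# Sketch of why inference, not training, dominates AI's energy bill.
# All three inputs are assumptions for illustration only.

TRAINING_MWH = 1_300           # assumed one-off training run (GPT-3-scale estimates)
WH_PER_QUERY = 3.0             # assumed electricity per chatbot response
QUERIES_PER_DAY = 100_000_000  # assumed daily query volume

inference_mwh = QUERIES_PER_DAY * 365 * WH_PER_QUERY / 1e6  # Wh -> MWh
share = inference_mwh / (inference_mwh + TRAINING_MWH)
print(f"First-year inference: {inference_mwh:,.0f} MWh ({share:.0%} of the total)")
```

Even with modest per-query figures, the one-off training cost is quickly swamped once a model is serving queries at scale, which is consistent with Thill’s 90% estimate.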

The electricity used for training and inference flows through a vast and growing digital infrastructure. Data centres are filled with servers built specifically for the part of the AI workload they handle. A single training server may have a central processing unit (CPU) barely more powerful than the one in your own computer, paired with dozens of specialised graphics processing units (GPUs) or tensor processing units (TPUs) – microprocessors designed to race through the huge volumes of simple calculations that AI models are made of.

If you’re using a chatbot and watching it spit out answers word by word, a powerful GPU somewhere is using about a quarter of the power it takes to boil a kettle. All of this is hosted in a data centre, whether owned by the AI vendor itself or by a third party, in which case it might be called “the cloud”, a fancy name for someone else’s computer.
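
As a rough sense-check of that kettle comparison, assume a typical 3 kW kettle and a top-end data centre GPU drawing about 700 W flat out; both wattages are assumptions, close to commonly quoted ratings rather than figures from this article.

```python
# Rough sense-check of the kettle comparison; both wattages are assumptions.
KETTLE_W = 3_000  # typical UK kettle
GPU_W = 700       # high-end data centre GPU running flat out

print(f"GPU draw as a share of a kettle: {GPU_W / KETTLE_W:.0%}")  # ~23%, about a quarter
```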

SemiAnalysis estimates that if generative AI were built into every Google search, it could translate into annual energy consumption of 29.2 TWh, comparable with what Ireland consumes in a year, although the financial cost to the company would be prohibitive. This has led to speculation that the search firm may start charging for some of its AI tools.
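
Working backwards from that 29.2 TWh figure gives a feel for the per-query cost. The daily search volume below is an assumption based on the widely cited estimate of about 9bn Google searches a day.

```python
# Implied energy per AI-assisted search, derived from the 29.2 TWh estimate.
# The search volume is an assumption (~9bn queries a day is widely cited).

ANNUAL_TWH = 29.2
SEARCHES_PER_DAY = 9_000_000_000

wh_per_search = ANNUAL_TWH * 1e12 / (SEARCHES_PER_DAY * 365)
print(f"Implied energy per search: {wh_per_search:.1f} Wh")  # ~8.9 Wh
```

That is roughly 30 times the 0.3 Wh figure Google has historically cited for a conventional search, which helps explain the prohibitive cost.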

But some argue that AI’s energy overhead is the wrong lens, and that the energy the new tools can save matters more. A provocative paper earlier this year in the peer-reviewed Nature journal Scientific Reports argued that the carbon footprint of writing and illustrating is lower for AI than for humans.

The study estimated that artificial intelligence systems emit “130 to 1,500 times” less carbon dioxide per page of text generated than human writers, and up to 2,900 times less per image.

What those human writers and illustrators would do instead is, of course, left unsaid. Retraining them for work in another field, such as green jobs, might be the next best thing.