
Wednesday, July 3, 2024
Courthouse News Service

AI energy demands may outgrow those of some countries, expert says

As AI becomes more efficient and widespread, it could soon require as much electricity as a small country.

(CN) — Artificial intelligence, or AI, promises gains in productivity, but one industry expert warned Tuesday that widespread use of the technology could push its electricity consumption past that of entire countries.

In a commentary piece in the journal Joule, Alex de Vries, a Ph.D. candidate at Vrije Universiteit Amsterdam and founder of Digiconomist, a research company that explores the unintended consequences of digital trends, discusses AI's rapid growth since 2022 and how it has influenced everyday tasks like coding, writing and even driving a car.

“Looking at the growing demand for AI service, it’s very likely that energy consumption related to AI will significantly increase in the coming years,” de Vries said in a statement.

The most rapid expansion has come in the form of generative AI, a tool that allows users to generate new text, images, videos and other data from simple prompts. OpenAI's DALL-E and ChatGPT are leading examples, the latter a conversational AI chatbot that reached 100 million users within two months and became especially popular among high school students.

Many have expressed concerns about generative AI plagiarism, but that hasn't deterred companies like Alphabet, Microsoft and Facebook from quickly developing their own AI platforms. The technology's quick expansion is problematic, de Vries argues, as its underlying processes are potentially costly to the environment.

Tools like ChatGPT and DALL-E use natural language processing and go through an initial training phase followed by an inference phase, both of which consume large amounts of energy because of the vast quantities of data they process, de Vries explains.

During the training phase, an AI model is fed large datasets to adjust its initial parameters to align a predicted output with a target output. For large language models or LLMs like ChatGPT, de Vries says, this process allows an AI model to predict specific words or sentences based on context, guiding its behavior.

Training alone consumes a lot of energy: de Vries notes that a single large language model can use anywhere from 324 to 1,287 megawatt-hours of electricity during that stage. For context, 433 megawatt-hours is enough to power 40 average American homes for a year.
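The homes comparison can be checked with quick arithmetic. The sketch below works backward from the article's figures; the resulting per-home number is an implied value, not one stated in the piece.

```python
# Sanity-check the article's comparison: 433 MWh of training energy
# described as enough to power 40 average American homes for a year.
TRAINING_MWH = 433   # training energy cited for one large language model
HOMES = 40           # number of homes in the article's comparison

# Implied annual electricity use per home, in kilowatt-hours.
implied_per_home_kwh = TRAINING_MWH * 1_000 / HOMES
print(f"Implied annual use per home: {implied_per_home_kwh:,.0f} kWh")
# → Implied annual use per home: 10,825 kWh
```

That implied figure of roughly 10,800 kWh per home per year is consistent with published U.S. residential averages, so the two numbers in the article agree with each other.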

“Each of these LLMs, including GPT-3, was trained on terabytes of data and has 175 billion or more parameters,” de Vries writes.

When a model begins the inference phase, it generates outputs from new data. For ChatGPT, this is when the chatbot creates new outputs from live user responses — a process that consumes an estimated 564 megawatt-hours of electricity daily.

But while companies are working to make AI technology more energy efficient, de Vries warns that those efficiency gains may only drive demand higher.

“The result of making these tools more efficient and accessible can be that we just allow more applications of it and more people to use it,” said de Vries.

One example comes from Google, which is currently incorporating generative AI into its email service and search engine. Given that Google's search engine already handles some 9 billion searches a day, de Vries estimates that running every search through AI would require roughly 29.2 terawatt-hours of electricity per year, a 60% increase from the company's energy consumption in 2021.
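The Google estimate can also be unpacked into a per-search figure. This is a back-of-envelope check assuming a uniform energy cost per search, which the article does not state explicitly.

```python
# Back-of-envelope check on the 29.2 TWh/year estimate for AI-powered search.
SEARCHES_PER_DAY = 9e9   # daily Google searches, per the article
ANNUAL_TWH = 29.2        # de Vries' estimated annual consumption

# Implied energy per AI-assisted search, in watt-hours.
wh_per_search = ANNUAL_TWH * 1e12 / (SEARCHES_PER_DAY * 365)
print(f"Implied energy per AI-assisted search: {wh_per_search:.1f} Wh")
# → Implied energy per AI-assisted search: 8.9 Wh
```

At roughly 9 watt-hours per search, each AI-assisted query would use many times the energy of a conventional one, which is what drives the 60% jump in the article's estimate.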

“The worst-case scenario suggests Google’s AI alone could consume as much electricity as a country such as Ireland,” de Vries writes, adding that this scenario also assumes “full-scale AI adoption utilizing current hardware and software, which is unlikely to happen rapidly.”

The high costs and short supply of AI servers will keep the odds of this scenario low for now. However, AI server production is expected to grow and raise AI-related electricity consumption by 85 to 134 terawatt-hours annually within the next three years.

Adding to this worry, efficiency gains could allow developers to repurpose existing computer processing chips for AI workloads, further increasing energy consumption.

To enhance transparency on AI’s environmental impacts, de Vries writes that regulators may want to consider introducing disclosure requirements. The author also notes that developers could think critically about whether certain applications would truly benefit from AI.

“The potential growth highlights that we need to be very mindful about what we use AI for. It’s energy intensive, so we don't want to put it in all kinds of things where we don’t actually need it,” de Vries said.

Follow @alannamayhampdx
