
AI Is Everywhere Now—and It’s Sucking Up a Lot of Water


From our collaborating partner “Living on Earth,” public radio’s environmental news magazine, an interview by Aynsley O’Neill with Shaolei Ren, associate professor of electrical and computer engineering at the University of California, Riverside

Artificial intelligence has become a part of everyday life, but there’s been little regulation thus far of its deployment and use. Currently, there’s no law on the books in the U.S. that requires AI companies to disclose their environmental impact in terms of energy and water use. Concerned researchers rely on voluntary disclosures from companies like Apple, Meta and Microsoft.

But research is showing that AI generation may be even more resource-intensive than originally thought. Imagine that you want to ask an AI program to write up a 100-word email for you. You get an almost instant response, but what you don’t see are the intensive computing resources that went into creating that email. At the AI data center, generating just two of those emails could use as much energy as a full charge on the latest iPhone. And according to a Pew Research Center study, that 100-word email could use up a whole bottle of water for the cooling that’s needed at data centers.

This interview with Shaolei Ren, associate professor of electrical and computer engineering at the University of California, Riverside, has been edited for length and clarity.


AYNSLEY O’NEILL: For those of us who are unfamiliar with the technical aspects of how AI works, why does it take so much more energy and so much more water than anything else you do on your computer? 

SHAOLEI REN: Well, because a large language model is by definition just really large. Each model has several billions of parameters, or even hundreds of billions of parameters. Let’s say you have 10 billion parameters to generate one token or one word: You’re going to go through 20 billion calculations. That’s a very energy-intensive process. This energy is converted into heat, so we need to get rid of the heat; water evaporation is one of the most efficient ways to cool down the data center facilities. That’s why we also use a lot of water besides the energy. 
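The arithmetic Ren describes can be sketched as a back-of-envelope estimate: generating one token takes roughly two operations per parameter (a multiply and an add), so a 10-billion-parameter model performs about 20 billion calculations per word. The factor of two and the model size here are illustrative assumptions drawn from his example, not measured figures.

```python
def flops_per_token(num_parameters: int) -> int:
    """Rough floating-point operations to generate one token:
    about one multiply and one add per model parameter."""
    return 2 * num_parameters

# The 10-billion-parameter example from the interview:
print(flops_per_token(10_000_000_000))  # 20 billion operations per token
```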

The water evaporates into the atmosphere, so it’s sometimes considered lost water. Technically it’s still within the global water cycle, but it’s not available for reuse in the short term from the same source. Water consumption is water withdrawal minus water discharge, and that’s very different from the water we use to take a shower. When you take a shower, you withdraw a lot of water, but there’s not much consumption.
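The withdrawal-versus-consumption distinction Ren draws can be written out directly: consumption is withdrawal minus discharge. The gallon figures below are made-up illustrations, not data from the interview.

```python
def water_consumed(withdrawn_gallons: float, discharged_gallons: float) -> float:
    """Consumption = withdrawal minus discharge, i.e. the water
    that does not return to its source in the short term."""
    return withdrawn_gallons - discharged_gallons

# A shower withdraws a lot of water but returns almost all of it:
print(water_consumed(20.0, 19.5))   # high withdrawal, low consumption
# Evaporative data-center cooling returns much less of what it takes in:
print(water_consumed(20.0, 4.0))    # most of the withdrawal is consumed
```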

O’NEILL: From what I understand in the United States, at least, the water that’s used in these AI data centers to do this cooling comes from local or municipal sources. What impact does an AI data center have on the local community around it? 

REN: In the U.S., roughly 80 percent to 90 percent of the water consumption for data centers comes from [public] water sources. We did a preliminary study, and it shows that data centers’ water consumption in the U.S. is already roughly 2 percent to 3 percent of the public water supply. So here we’re talking about consumption, not water withdrawal. Based on the estimate by EPRI [the Electric Power Research Institute], AI energy demand could go up to 8 percent by the 2030s.

O’NEILL: There’s an ongoing debate in Memphis, where tech billionaire Elon Musk is trying to build a massive server facility to accommodate AI. The local utility estimates that this system is going to need something like a million gallons of water per day to cool it. From your perspective, how should local communities weigh the benefits versus the costs of having these local AI data centers?


REN: I think there are benefits, especially in terms of economic development. For example, the data center construction will bring some tax revenues, and after the completion, there will be a steady stream of tax dollars for the local government. 

But on the other hand, drawing millions of gallons of water a day from natural resources could be an issue. I heard that the local water utility says the project takes up 1 percent of the total water supply. But they are probably comparing the data center’s water consumption with the utility’s total water withdrawal. The utility supplies water to residents and to other industries, but [that] usage is mostly withdrawal, because the water is returned to the supply almost immediately. When a data center takes the water, however, most of it is evaporated. So withdrawal is not really the right metric to compare against. That 1 percent of withdrawal going to data centers could mean that water consumption is roughly 5 percent to 10 percent.

O’NEILL: Now, the companies that created and operate these AI systems themselves have an interest in making the technology more efficient. What kind of possible improvements in AI technology could make it more energy or water efficient over time?

REN: They definitely have the incentive to reduce the energy consumption, reduce the resource consumption for training and inference. We have seen a lot of research proposals and solutions that promise to reduce the energy consumption, but it turns out that in reality, the systems are not that optimized. 

I saw a paper from a top tech company’s research team, and it shows that the energy consumption is 10 times higher than what we thought before, even though they’re using state-of-the-art optimization techniques. So they do have the incentive to reduce the energy and resource usage for AI computing. 

However, the real world is a different story, partly because they have strict service-level objectives to meet, which means they need to return the responses to users in a short amount of time—and that limits how well they could optimize their system. If they are just doing batch processing, they could be very energy efficient, but it turns out, in reality, there are a lot of constraints which prohibit them from using those optimization techniques. 

Maybe we can compare a bus with a passenger car. In general, per passenger, the bus should be more energy efficient than the car, assuming the bus is fully loaded. But in reality, because of random patterns in user requests and some other constraints, the bus is not fully loaded at all. If you have a 50-passenger bus that usually carries just five passengers, then on average its per-passenger fuel efficiency is much worse than a passenger car’s.

O’NEILL: AI has become a really immense part of many people’s day-to-day lives. It’s supposed to make our lives easier, but it comes at this sort of tremendous cost to the environment. What’s the solution here? If the technological advances aren’t working out the way we’re hoping they will, what’s the fix?

REN: One potential fix is, instead of using larger and larger models, we could be using smaller and smaller models, because usually those smaller models are good enough to complete many of the tasks that we actually care about. 

For example, if you just want to know the weather, or get a summary of a text, a smaller model is usually good enough, and a smaller model means you’re going to save a lot of resources and energy. Sometimes you could even run small models on your cell phone, and that can further cut energy use by, say, 80 percent very easily, compared to running a larger model in the cloud.

About This Story

Perhaps you noticed: This story, like all the news we publish, is free to read. That’s because Inside Climate News is a 501(c)(3) nonprofit organization. We do not charge a subscription fee, lock our news behind a paywall, or clutter our website with ads. We make our news on climate and the environment freely available to you and anyone who wants it.

That’s not all. We also share our news for free with scores of other media organizations around the country. Many of them can’t afford to do environmental journalism of their own. We’ve built bureaus from coast to coast to report local stories, collaborate with local newsrooms and co-publish articles so that this vital work is shared as widely as possible.

Two of us launched ICN in 2007. Six years later we earned a Pulitzer Prize for National Reporting, and now we run the oldest and largest dedicated climate newsroom in the nation. We tell the story in all its complexity. We hold polluters accountable. We expose environmental injustice. We debunk misinformation. We scrutinize solutions and inspire action.

Donations from readers like you fund every aspect of what we do. If you don’t already, will you support our ongoing work, our reporting on the biggest crisis facing our planet, and help us reach even more readers in more places?

Please take a moment to make a tax-deductible donation. Every one of them makes a difference.

Thank you,

David Sassoon
Founder and Publisher

Vernon Loeb
Executive Editor

