
    AI Data Centers: What to Know About Their Water and Energy Use

    When people find out I’m a journalist who covers AI, they often ask about the heavy energy consumption of AI data centers. Are these centers using up all of our drinking water? How is this tech affecting the environment? Is AI going to kill us all? The questions range from curious to downright dystopian.

    Sam Altman, the CEO of OpenAI, recently faced criticism after calling some of these concerns, particularly those around water, “totally fake.” It all stems from a Q&A session hosted by The Indian Express newspaper. Around the 26-minute mark of the interview, Altman was asked to respond to certain criticisms of AI, including the amount of natural resources it takes to power large language models like ChatGPT.

    Altman responded, “(criticism of AI for overuse of) water is totally fake,” saying that while extreme water use “used to be true,” OpenAI no longer does evaporative cooling. He said estimates that 17 gallons of water are used for every chatbot query are no longer accurate.

    “This is completely untrue and totally insane, [and has] no connection to reality,” he said. He then went on to address AI energy consumption, calling the concerns “fair” but arguing that it should be evaluated as a whole, not per query, since some queries, like videos, are more intensive to generate than text conversations. (Disclosure: Ziff Davis, CNET’s parent company, in 2025 filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)

    Still, Altman says, “we need to move toward nuclear or wind and solar (power) very quickly.”

    Questions involving data centers and water are complicated.

    Do AI data centers strain land and power systems?

    Altman’s remarks come amid an ongoing debate over data centers and their energy use.

    CNET’s Corin Cesaric dove into the issue of AI’s energy use last year and found the cost of training and running ChatGPT, Gemini, Claude and other generative AI tools to be “staggering.” The US accounted for the largest share (45%) of global data center electricity consumption in 2024, according to the International Energy Agency.

    As for water: Two Google data centers in Council Bluffs, Iowa, alone used 1.4 billion gallons of water in 2024, enough to fill about 28 million standard bathtubs. Google has 29 data centers worldwide. Meta’s data centers used about 1.39 billion gallons of water in 2023.
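    The bathtub comparison above checks out as simple arithmetic if you assume a standard tub holds about 50 gallons (the tub capacity is our assumption, not a figure from the article):

```python
# Back-of-envelope check: 1.4 billion gallons vs. ~28 million bathtubs.
# Assumption (not from the article): a standard bathtub holds about 50 gallons.
GALLONS_USED = 1.4e9          # Google's two Council Bluffs data centers, 2024
GALLONS_PER_BATHTUB = 50

bathtubs = GALLONS_USED / GALLONS_PER_BATHTUB
print(f"{bathtubs:,.0f} bathtubs")  # 28,000,000 bathtubs
```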

    While we don’t currently have statistics from OpenAI, Meta, or Google on their natural resource consumption in 2025, it’s safe to bet that data center energy and water use will rise as more people use generative AI.

    How do AI data centers use water?

    Considering ChatGPT now has close to 1 billion weekly users, and OpenAI has estimated that it handles close to 2.5 billion prompts every day, that’s an astronomical amount of data to manage. And because of this demand, the powerful computers that train the AI models and process their prompts get extremely hot. Think of how your phone and laptop heat up when running demanding tasks. If servers overheat, they can slow down or become damaged. This is where water comes in.
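    To put OpenAI’s estimate in perspective, 2.5 billion prompts a day can be restated as a per-second rate (a unit conversion on the article’s figure, not a number OpenAI reports):

```python
# Convert OpenAI's estimated daily prompt volume into a per-second rate.
PROMPTS_PER_DAY = 2.5e9
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds in a day

rate = PROMPTS_PER_DAY / SECONDS_PER_DAY
print(f"about {rate:,.0f} prompts per second")  # about 28,935 prompts per second
```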

    Traditionally, water in AI data centers is used in two ways: evaporative cooling (consuming water) and closed-loop systems (recirculating water).

    Evaporative cooling uses the natural process of evaporation: liquid water is converted into water vapor, absorbing heat as it changes phase, but that water is lost to the atmosphere. Closed-loop cooling is a more resource-efficient approach that recirculates the same water to carry heat away, without evaporating or consuming it.

    OpenAI said in a January announcement that it is “prioritizing closed-loop or low-water cooling systems” to minimize water use. This does lend credence to Altman’s recent claims that OpenAI’s water use is not as high as the 17 gallons per query estimate, but we don’t yet have exact figures for OpenAI’s 2025 water use.

    OpenAI says it is moving away from the more costly evaporative cooling systems. However, 56% of data centers still use this method in some form over closed-loop systems, according to a January 2026 report from global water technology company Xylem and market research firm Global Water Intelligence. The research anticipates that AI water consumption will spike nearly 130% by 2050.

    How much energy does AI use?

    Powering AI and these massive data centers is demanding.

    Generative AI chatbots use more energy than traditional search engines like Google or Bing. One estimate calculated that a single chatbot query requires 10 times more electricity than a Google search. On average, a single text query takes about 0.24 to 3 watt-hours, but AI-generated videos and images require much more electricity.

    An August 2025 report from Google details Gemini’s energy use. The report states “the median Gemini Apps text prompt uses 0.24 watt-hours (Wh) of energy, emits 0.03 grams of carbon dioxide equivalent (gCO2e) and consumes 0.26 milliliters (or about five drops) of water.” Google equates this energy consumption to watching TV for about nine seconds.
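    The nine-second comparison works out arithmetically if you assume an appliance drawing about 100 watts (the wattage is our assumption, not a number from Google’s report):

```python
# Convert Gemini's median per-prompt energy into seconds of appliance runtime.
# Assumption (not from Google's report): the appliance draws about 100 watts.
ENERGY_WH = 0.24        # median Gemini text prompt, per Google's report
APPLIANCE_WATTS = 100

# Wh / W gives hours; multiply by 3600 to get seconds.
seconds = ENERGY_WH / APPLIANCE_WATTS * 3600
print(f"{seconds:.2f} seconds")  # 8.64 seconds
```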

    Is solar a valid alternative?

    AI models require power around the clock, but solar energy, paired with battery storage, is a viable and scalable option for powering AI data centers.

    OpenAI announced a multi-billion-dollar venture in October 2025 to explore new energy generation with solar and battery storage. Meta, Microsoft, Google and Amazon all expanded their solar power use across the US in 2025.

    While renewable solutions could be the path forward, solar (or wind) energy is still only part of the mix of energy generation used by data centers. They generally rely on the grid itself, which is still largely powered by the burning of fossil fuels like natural gas.

    Where we stand

    The conversation around AI and water use is moving from unconfirmed claims to measured scrutiny. Communities and policymakers are now pushing for transparency and sustainable practices, aiming to ensure that AI’s rapid growth doesn’t come at the expense of local water resources or the local electricity grid. As AI continues to grow, so, too, will the debate about how best to balance technological innovation with environmental responsibility.
