Every time you ask Google’s Gemini a question, it uses about as much energy as watching nine seconds of TV.
That’s what the tech giant says in a new report detailing the energy consumption, emissions and water use of its generative artificial intelligence that users turn to every day for everything from writing tips to fact checking. A single Gemini text query emits 0.03 grams of carbon dioxide equivalent and consumes about five drops of water, according to Google.
Google, which included its methodology and formulas in the report, says other tech companies don’t disclose the energy demands of their power-hungry AI models in this much detail, but it is hoping that will change. “As a community, as an industry, we’re not very consistent about how we measure energy,” said Parthasarathy Ranganathan, who helped conduct Google’s research.
But the tech giant also appears to be looking to ease a brewing anxiety about AI searches: that frequent use of generative AI such as Gemini is detrimental to the environment. The Google report points out that its Gemini energy figures are lower than public estimates.
This summer Google said in an environmental report that its emissions have jumped 51% since 2019, driven by the demands of AI. But the AI-focused report from August says Google’s AI systems are becoming more efficient, and that a median Gemini text prompt uses one thirty-third of the energy it did 12 months ago.
Global demand for AI is ramping up rapidly. Electricity demand from data centers worldwide is set to more than double by 2030 to about 945 terawatt-hours, which is more than Japan’s total electricity consumption, according to the International Energy Agency.
“A single AI-focused data center can use as much electricity as a small city and as much water as a large neighborhood,” the Union of Concerned Scientists says on its website.
A data center that fuels AI can consume as much electricity as 100,000 households, but the largest ones now under construction could consume 20 times as much. It’s a particular problem in the U.S., where data centers will make up nearly half of electricity demand growth over the next five years, according to the IEA.
There’s also been heightened concern recently about the amount of water required to cool electrical equipment in data centers. Just a few weeks ago, French company Mistral AI released a report detailing the environmental footprint of training its language model Mistral Large 2, including how much water it consumes. Generating one page of text consumes 0.05 liter of water, enough to grow a small radish, the report says.
OpenAI Chief Executive Sam Altman said in a recent blog post that “people are often curious about how much energy a ChatGPT query uses.” In response, he said the average query uses about the amount an oven would use in just over one second, and one-fifteenth of a teaspoon of water. But he didn’t include his methodology.
There are ways to reduce the power demands of AI, such as power capping, which slows an AI response by just a few milliseconds for non-urgent requests. Data centers can also cut their water use by relying on recycled water or water that isn’t safe to drink.
The type of query you feed to generative AI also matters. Energy demands can be dampened “if you can remove some of that back and forth, and make your prompt a little simpler and easier to understand,” said Vijay Gadepally, a senior scientist at the Massachusetts Institute of Technology, where he researches AI and environmentally friendly computing. He said it’s better to use such tools sparingly. A study from UNESCO found that shorter, more concise prompts, along with using smaller AI models, can dramatically reduce energy use.
The energy use of one company’s generative AI when responding to a standard question might look completely different from that of another business, Gadepally said. A tech company might say that “for a typical query, it uses this many kilowatt-hours. Well, what is your definition of a typical query?” he said.
Tech giants are announcing many new clean-energy power agreements to fuel their AI ambitions. Google, for one, recently announced power deals ranging from geothermal to hydropower, and earlier this week it announced an advanced nuclear reactor project in Tennessee with Kairos Power.
But there are political headwinds marring the tech industry’s plans. Big tech companies, the largest purchasers of clean energy and under pressure to meet their carbon emission reduction goals, have pushed the Trump administration not to cut clean energy subsidies.
It’s important for tech companies to also divulge how frequently their AI is receiving queries, Gadepally said.
“If it’s being used by one person, emissions are lower, but that’s different if it’s billions of people at 30 data centers across the world,” he said.
Google noted this, too.
“While the impact of a single prompt is low compared to many daily activities, the immense scale of user adoption globally means that continued focus on reducing the environmental cost of AI is imperative,” the report said.
Write to Clara Hudson at clara.hudson@wsj.com