eDiscovery Today by Doug Austin

How Much Energy Does an AI Prompt Use? Google Provides an Estimate: Artificial Intelligence Trends


How much energy does an AI prompt use? Google may have provided the most transparent estimate yet on energy use of AI.

As discussed in MIT Technology Review (In a first, Google has released data on how much energy an AI prompt uses, written by Casey Crownhart and available here), last week, Google released a technical report detailing how much energy its Gemini apps use for each query. In total, the median prompt—one that falls in the middle of the range of energy demand—consumes 0.24 watt-hours of electricity, the equivalent of running a standard microwave for about one second. The company also provided average estimates for the water consumption and carbon emissions associated with a text prompt to Gemini.
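The microwave comparison is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, assuming a typical microwave draws roughly 900 watts (the wattage is my assumption, not from Google's report):

```python
# Sanity-check the comparison: 0.24 Wh vs. ~1 second of microwave use.
WH_TO_JOULES = 3600  # 1 watt-hour = 3600 joules

prompt_wh = 0.24                           # median Gemini prompt, per Google's report
prompt_joules = prompt_wh * WH_TO_JOULES   # 864 J

microwave_watts = 900                      # assumed typical microwave power draw
microwave_seconds = prompt_joules / microwave_watts

print(f"{prompt_joules:.0f} J ~ {microwave_seconds:.2f} s of microwave use")
```

At 900 W the median prompt works out to just under one second of microwave time, so Google's comparison holds up for any microwave in the common 800–1,000 W range.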

It’s the most transparent estimate yet from a Big Tech company with a popular AI product, and the report includes detailed information about how the company calculated its final estimate. As AI has become more widely adopted, there’s been a growing effort to understand its energy use. But public efforts to directly measure the energy used by AI have been hampered by a lack of full access to the operations of a major tech company.


Google’s figure, however, is not representative of all queries submitted to Gemini: The company handles a huge variety of requests, and this estimate is calculated from a median energy demand, one that falls in the middle of the range of possible queries. For example, feeding dozens of books into Gemini and asking it to produce a detailed synopsis of their content is “the kind of thing that will probably take more energy than the median prompt,” says Jeff Dean, Google’s chief scientist. Using a reasoning model could also have a higher associated energy demand because these models take more steps before producing an answer.

Rather than using an emissions estimate based on the US grid average, or the average of the grids where Google operates, the company instead uses a market-based estimate, which takes into account electricity purchases that the company makes from clean energy projects. The company has signed agreements to buy over 22 gigawatts (Great Scott!) of power from sources including solar, wind, geothermal, and advanced nuclear projects since 2010. Because of those purchases, Google’s emissions per unit of electricity on paper are roughly one-third of those on the average grid where it operates.
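The difference between the two accounting methods can be illustrated with hypothetical numbers (the figures below are illustrative assumptions, not Google's actual data; only the roughly-one-third ratio comes from the article):

```python
# Illustrative sketch: location-based vs. market-based emissions accounting.
# All numbers are hypothetical; only the ~1/3 ratio reflects the article.
energy_kwh = 1000.0   # electricity consumed
grid_factor = 400.0   # gCO2e per kWh on the local grid (hypothetical)
clean_share = 2 / 3   # share of consumption matched by clean-energy purchases

# Location-based: apply the local grid's average emissions factor.
location_based_g = energy_kwh * grid_factor

# Market-based: credit contracted clean-energy purchases against consumption.
market_based_g = energy_kwh * grid_factor * (1 - clean_share)

ratio = market_based_g / location_based_g
print(f"Market-based emissions are {ratio:.0%} of the location-based figure")
```

With two-thirds of consumption matched by clean-power contracts, the market-based figure lands at one-third of the location-based one, which is the kind of gap the article describes.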

AI data centers also consume water for cooling, and Google estimates that each prompt consumes 0.26 milliliters of water, or about five drops.
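The "five drops" figure also checks out, assuming a drop of about 0.05 mL (a common pharmaceutical approximation; the drop size is my assumption):

```python
# Rough check of the "about five drops" comparison.
ML_PER_DROP = 0.05   # assumed volume of one drop, ~0.05 mL
prompt_ml = 0.26     # water per prompt, per Google's estimate

drops = prompt_ml / ML_PER_DROP
print(f"{prompt_ml} mL ~ {drops:.1f} drops")
```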

Of course, Google doesn’t share the total number of queries that Gemini gets each day, which would allow estimates of the AI tool’s total energy demand. Without that figure, the report is less informative than it could be. But it’s a start…I guess? 🤔


So, what do you think? Do you think this report sheds light on how much energy is being used by AI? Or does it leave a lot of important information out? Please share any comments you might have or if you’d like to know more about a particular topic.

Image created using Microsoft Designer, using the term “robot lawyer looking at one molecule of energy”.

Disclaimer: The views represented herein are exclusively the views of the authors and speakers themselves, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.
