FROM THE FRONTIER
Google shares its receipts on AI energy consumption — and busts a major myth
Here’s something interesting to think about. For all the chatter around AI’s environmental footprint, how much energy do you think a single text prompt uses? Hint: the number probably isn’t as big as some are claiming.
Efforts to understand AI’s energy footprint have largely been stonewalled, mostly due to a lack of transparency from major tech companies. Now, Google has finally pulled back the curtain with a detailed report on just how much energy its Gemini models consume.
Here’s what it found: A typical text query burns through about 0.24 watt-hours of electricity — about as much as running a microwave for one second. Each text prompt also uses about five drops of water and emits 0.03 grams of carbon dioxide — figures the company says are “substantially lower” than many public estimates.
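As a quick back-of-the-envelope check of the microwave comparison (assuming a typical ~1,000-watt microwave, a figure not taken from Google's report):

```python
# Sanity-check the comparison: a median Gemini text prompt reportedly
# uses ~0.24 Wh. How does that compare to one second of microwave use?
microwave_watts = 1_000            # assumed power draw of a typical microwave
seconds = 1
microwave_wh = microwave_watts * seconds / 3600   # watt-seconds -> watt-hours

prompt_wh = 0.24                   # Google's reported per-prompt figure

print(f"1 s of microwave: {microwave_wh:.2f} Wh")
print(f"One text prompt:  {prompt_wh:.2f} Wh")
```

One second at 1,000 W works out to roughly 0.28 Wh, within about 15% of the reported per-prompt figure, so "about one second of microwave use" holds up as a rough comparison.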
What makes this study unique is its “full stack” approach. Unlike other studies that zero in on the AI models themselves, Google accounts for everything — from the chips running the AI to the cooling systems keeping data centers operational. It’s the most transparent estimate yet from a Big Tech company with a popular AI product.
There are some important caveats, though. Google’s figure only represents the energy required for a single prompt, not the total energy that went into training the underlying AI model (which can be a lot). Some queries, like generating images and videos, can also use a lot more power. While Google’s transparency and efficiency gains are a welcome step forward, the industry still has some way to go if it’s to keep up with AI’s increasing energy demands.
via: Superhuman
