How much energy does a single chatbot prompt use? This AI tool can show you

AI systems require a lot of energy to function, but no one has exact numbers, especially not for individual chatbot queries. An engineer at Hugging Face built a tool to produce some estimates.
The language surrounding AI infrastructure, much of which emphasizes “the cloud” and other air-themed metaphors, can obscure the fact that it relies on energy-hungry computers. To run complex computations quickly, AI systems require powerful chips, multiple GPUs, and expansive data centers, all of which consume power when you ask ChatGPT a question. This is part of why free-tier access to many chatbots comes with usage limits: electricity costs make computing expensive for the hosting company.
Chat UI Energy
To demystify some of this, Hugging Face engineer Julien Delavande built an AI chat interface that shows real-time energy use estimates for your conversations. It compares how much energy various models, tasks, and requests use — for example, a prompt that requires reasoning is likely to use more energy than a simple fact-finding query. In addition to watt-hours and joules, the tool shows usage in more accessible metrics, such as the percentage of a phone charge or driving time, using data from the Environmental Protection Agency (EPA).
Asking Hugging Face’s chatbot (running Qwen/Qwen2.5-VL-7B-Instruct) about the weather used about 9.5% of a phone charge.
Screenshot by Radhika Rajkumar/ZDNET
When I asked Chat UI Energy about the weather in New York City, the first comparison it showed me was for a phone charge (my query used about 9.5%). When I clicked on that estimate, the tool toggled through other equivalent comparisons, including 45 minutes of LED bulb use, 1.21 seconds of microwave use, and 0.15 seconds of toaster energy. As you continue chatting, the bot shows the total energy usage and time of the conversation at the bottom of the chat window.
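The conversions behind those comparisons are simple arithmetic: divide the query's energy estimate by a typical device's power draw or battery capacity. The sketch below illustrates the idea; the device figures are rough assumptions for illustration, not the values the tool or the EPA actually uses.

```python
# Sketch: turning a per-query energy estimate (in watt-hours) into
# everyday device equivalents. Device figures below are assumed
# typical values, not the tool's real conversion factors.

PHONE_BATTERY_WH = 15.0  # assumed smartphone battery capacity, ~15 Wh
LED_BULB_W = 10.0        # assumed LED bulb power draw, watts
MICROWAVE_W = 1100.0     # assumed microwave power draw, watts


def equivalents(query_wh: float) -> dict:
    """Express an energy estimate as device-usage equivalents."""
    return {
        # fraction of a full phone charge, as a percentage
        "phone_charge_pct": 100.0 * query_wh / PHONE_BATTERY_WH,
        # minutes an LED bulb could run on the same energy
        "led_bulb_minutes": 60.0 * query_wh / LED_BULB_W,
        # seconds of microwave use on the same energy
        "microwave_seconds": 3600.0 * query_wh / MICROWAVE_W,
    }


print(equivalents(1.5))
```

On these assumed figures, a 1.5 Wh query works out to 10% of a phone charge and 9 minutes of LED bulb use, which shows how sensitive the comparisons are to the device numbers chosen.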
Though my query was very simple, it asked for information the bot can't reach: it has no internet access. That may be why it took 90 seconds (and more energy than expected) to return a response. Still, even as an estimate, 45 minutes of LED bulb use for a single weather question is striking, and it puts the energy cost of much more complex, multi-step prompts into perspective.
Only AI companies know how much energy their systems really use, but demand trends suggest that figure is only increasing. A 2024 International Energy Agency report projects that global electricity demand will grow by an average of 3.4% annually through 2026, a faster rate than usual, driven in part by "a notable expansion" of data centers. A Berkeley Lab report likewise found data center energy use to be accelerating, with expected growth of "13% to 27% between 2023 and 2028."
The tool's release emphasizes the distinction between open-source platforms like Hugging Face and more opaque AI companies.
“With projects like the AI Energy Score and broader research on AI’s energy footprint, we’re pushing for transparency in the open-source community,” the chat’s creators said in the announcement. “One day, energy usage could be as visible as nutrition labels on food!”
How to try it yourself
You can try the chatbot here and toggle through several open-source models, including Google Gemma 3, Meta’s Llama 3.3, and Mistral Nemo Instruct.