How much energy does AI really use? The answer is surprising – and a little complicated



AI feels inescapable. It's everywhere: your smartphone, Google, work tools. AI features promise to make life easier and more productive, but what exactly is the environmental impact of a quick chatbot query?

As AI adoption continues to grow, so do the technology's energy costs. AI systems are compute-intensive and ingest vast amounts of data, which must be stored on large networks of computers known as data centers. Just like your personal computer, those gigantic centers need electricity, as does the process of training an AI model, which relies on far more compute than traditional computing tasks.

Also: How much energy does a single chatbot prompt use? This AI tool can show you

But in the context of the energy we already use every day, from office lights and laptops to social media, how does that consumption actually compare? Can the technology’s resource needs change or be improved over time? Is the time it supposedly saves worth the extra emissions? And what should you know about your personal AI footprint? 

We spoke with experts and researchers to help explain how AI really uses energy and answer your sustainability questions, complete with tips on what you can do. 

What is a data center?

AI needs more resources to function than other kinds of technology. The amount of data AI systems ingest and the computing power required to run them set them apart from simpler computer tasks. An AI system is effectively a synthetic brain that needs to be fed billions of pieces of data in order to find the patterns between them. This is why larger-parameter models tend to be better at certain tasks — an image model trained on four billion images of cats, for example, should produce a more realistic image of a cat than one trained on just 100 million. 

But all that knowledge needs to live somewhere. What you’ve heard described as “the cloud” is not an airy name for storage, but a physical data center, or a large campus that houses expansive networks of computers that process and store huge amounts of data and run complex queries.  

Also: AI data centers are becoming ‘mind-blowingly large’

While these large computing farms have always existed, primarily for enterprise cloud services, they’re in more demand than ever as the AI race intensifies — and as the tools themselves get cheaper and more accessible. 

“You have big companies that have been managing those as real estate assets,” said John Medina, an SVP at Moody’s. “Everyone only needed a little bit; they didn’t need a ton of capacity.” 

Now, he said, the pressure is on to serve a rapidly growing customer base.

That demand is driving up energy use, and the more parameters a model has, the more compute it’s using, said Vijay Gadepally, a senior staff member at MIT’s Lincoln Laboratory and CTO at Radium, an AI infrastructure company. “You need more computing just to even store the model and be able to process it.” 

With investment in AI only gaining speed, data center growth shows no signs of stopping. Shortly after taking office in January, President Donald Trump announced Project Stargate, a $500-billion initiative supported by companies including OpenAI, Softbank, and Oracle to build "colossal," 500,000-square-foot data centers. Much of this infrastructure is being built by hyperscalers, a small but dominant group of corporations, including Microsoft, Google, Meta, and AWS, that account for the lion's share of construction.

Also: The future of computing must be more sustainable, even as AI demand fuels energy use

However, Medina noted that the hype cycle may be inflating how much data center growth is AI-specific. “When we talk about hyperscalers, large data centers, AI data centers, we get confused. Most of it is for the cloud,” he said, referring to services like storage and data processing. He noted that despite all the chatter, data centers are only processing a relatively small number of AI-related tasks.

That said, the AI boom is shifting baseline standards in ways that make relative comparisons harder to pin down. "In the past, you didn't have a huge need like this. Four megawatts were considered hyperscale," Medina said. "Now, 50, 100 megawatts is that minimum."

How much energy does AI use?

As Sasha Luccioni, Ph.D., AI and climate lead at developer platform Hugging Face, admitted in a recent op-ed, we still don’t really know how much energy AI consumes, because so few companies publicize data about their usage.

However, several studies indicate energy consumption is on the rise, nudged along by growing demand for AI. A 2024 Berkeley Lab analysis found that electricity consumption has grown exponentially in tandem with AI in recent years. GPU-accelerated servers – hardware specifically used for AI – multiplied in 2017; a year later, data centers made up nearly 2% of total annual US electricity consumption, and that number was growing by 7% annually. By 2023, that growth rate had jumped to 18%, and it is projected to hit as much as 27% by 2028. Even if we can't isolate how much data center energy is spent on AI, the link between rising consumption and AI expansion is clear.

Also: How your inefficient data center hampers sustainability – and AI adoption

Boston Consulting Group estimates that data centers will account for 7.5% of all US electricity consumption by 2030, or the equivalent of 40 million US homes.

Mark James, interim director of the Institute for Energy and the Environment at Vermont Law and Graduate School, offered another comparison. A large facility running at full capacity draws 1,000 megawatts of power – "the same size as the peak demand of the state of Vermont — 600,000+ people — for months," he noted.

Currently, global data centers use about 1.5% of the world's electricity, roughly the same as the entire airline industry. They are likely to surpass it; an April 2025 IEA report found that, globally, data center electricity use has grown 12% every year since 2017, "more than four times faster than the rate of total electricity consumption." Data centers, directly or indirectly propelled by AI, are taking up more space in the world's energy landscape, even as other energy usage appears to stay mostly flat.

For some, that’s reason to worry. “This is going to be a carbon problem very quickly if we’re scaling up power generation,” Gadepally warned. 


Others aim to put these numbers in context. While there’s evidence AI is driving up energy costs, research also shows global energy consumption overall is on the rise. Newer data centers and GPUs are also more energy efficient than their predecessors, meaning they may create relatively less carbon. “These 100, 200-megawatt massive builds are using the most efficient technology — they’re not these old power guzzlers that the older ones are,” Medina said. Even as data centers multiply, their predicted consumption curve may start to level out thanks to modern technology. 

Within AI energy use, not all types of AI share the same footprint. We don't have access to energy consumption data for proprietary models from companies like OpenAI and Anthropic (unlike open-source models, whose usage researchers can measure directly). However, across the models that can be measured, generative AI, especially image generation, appears to use more compute (and therefore create more emissions) than standard AI systems.

An October 2024 Hugging Face study of 88 models found that generating and summarizing text uses more than 10 times the energy of simpler tasks like classifying images and text. It also found that multimodal tasks, in which models use image, audio, and video inputs, are “on the highest end of the spectrum” for energy use. 

Does one ChatGPT query really use a bottle of water? 

When it comes to specific comparisons, research is all over the map on the resources AI uses. One study determined that asking ChatGPT to write a 100-word email uses an entire bottle of water — a claim that’s quickly circulated on social media. 

But is it true?

“It’s possible,” said Gadepally. He pointed out that GPUs generate a lot of heat; even when being cooled by other methods, they still require water cooling as well. “You’re using something like 16 to 24 GPUs for that model that may be running for 5 to 10 minutes, and the amount of heat that’s generated, you can start to kind of do the math,” he said.

These systems don’t just use any kind of water, either – they need clean, high-quality, potable water running through them. “These pipes, they don’t want to clog them up with anything,” Gadepally explained. “Many data centers are in areas with stressed watersheds, so that’s something to keep in mind.” 

New methods like immersion cooling, in which processors are submerged in a liquid such as mineral oil, show some promise for reducing water use and energy consumption compared to other cooling methods like fans. But the tech is still developing and would need to be widely adopted to make an impact.

Also: The best AI image generators of 2025: Gemini, ChatGPT, Midjourney, and more

With proprietary data still murky, there are several other comparisons out there for how much energy chatbot queries use. Jesse Dodge, a researcher from nonprofit institute Ai2, has compared one ChatGPT query to the electricity used to power one light bulb for 20 minutes. 

The Hugging Face study noted that “charging the average smartphone requires 0.022 kWh of energy, which means that the most efficient text generation model uses as much energy as 9% of a full smartphone charge for 1,000 inferences, whereas the least efficient image generation model uses as much energy as 522 smartphone charges (11.49 kWh), or around half a charge per image generation.” 
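Working through the study's figures as quoted above, the per-query numbers fall out directly. This sketch only restates that arithmetic; the constants come from the quote, not from new measurements:

```python
# Figures quoted from the October 2024 Hugging Face study.
PHONE_CHARGE_KWH = 0.022  # energy to fully charge an average smartphone

# Most efficient text model: 9% of a full charge per 1,000 inferences.
text_kwh_per_1000 = 0.09 * PHONE_CHARGE_KWH
text_wh_per_query = text_kwh_per_1000 * 1000 / 1000  # kWh -> Wh, then per query

# Least efficient image model: 11.49 kWh per 1,000 generations.
image_kwh_per_1000 = 11.49
image_wh_per_query = image_kwh_per_1000 * 1000 / 1000

print(f"Text query:  ~{text_wh_per_query:.4f} Wh")
print(f"Image query: ~{image_wh_per_query:.2f} Wh")
print(f"Ratio: ~{image_wh_per_query / text_wh_per_query:,.0f}x")
```

On these numbers, a single image generation from the least efficient model costs thousands of times more energy than a single text query from the most efficient one, which is why the spread between model types matters more than any single headline figure.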

According to Gadepally, an AI model processing a million tokens — roughly a dollar in compute costs — emits about as much carbon as a gas-powered car does while driving five to 20 miles. But energy use also varies widely depending on the complexity of the prompt you're using. "Saying 'I want a short story about a dog' will likely use less compute than 'I would like a story about a dog that's sitting on a unicorn written in Shakespearean verse,'" he said.

If you're curious about how your individual chatbot queries use energy, Hugging Face designed a tool that estimates the energy consumption of queries to different open-source models. Green Coding, an organization that works with companies to track the environmental impact of their tech, designed a similar tool.

How does AI’s energy consumption compare to other tech?

While it’s true that overall energy consumption appears to be increasing in part due to AI investment, researchers urge users to see energy consumption as relative. 

The figure that one ChatGPT query uses 10 times as much energy as a Google search has become standard, but it rests on a now-outdated 2009 Google estimate that one search consumes 0.3 watt-hours (Wh). It's hard to say whether that number has risen or fallen since, given changes in the complexity of Google searches and improvements in chip efficiency.

Either way, as data scientist and climate researcher Hannah Ritchie pointed out, that 0.3 Wh of energy needs to be put in perspective — it’s relatively small. She noted that in the US, average daily electricity usage is about 34,000 Wh per person. Using the outdated Google metric, a ChatGPT prompt is just 3 Wh; even with multiple queries a day, that’s still not a huge percentage. 
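Ritchie's comparison is easy to reproduce. A minimal sketch, using the same outdated 3 Wh-per-prompt figure quoted above:

```python
# Ritchie's comparison: chatbot prompts vs. average daily electricity use.
DAILY_US_WH = 34_000  # average US electricity use per person, per day (Wh)
PROMPT_WH = 3         # 10x the 2009 Google-search estimate of 0.3 Wh

for prompts_per_day in (1, 10, 100):
    share = prompts_per_day * PROMPT_WH / DAILY_US_WH
    print(f"{prompts_per_day:>3} prompts/day = {share:.3%} of daily electricity use")
```

Even at 100 prompts a day, the share stays below 1% of an average American's daily electricity use, which is the heart of Ritchie's point.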


Researchers compare the energy costs of using ChatGPT relative to individual daily energy consumption.

Plus, tech that doesn’t explicitly use AI already uses lots of data center bandwidth. 

“What are the hottest digital applications today? TikTok, Instagram Reels, YouTube searches, streaming, gaming — all of these things are hosted from the cloud,” said Raj Joshi, another analyst and SVP at Moody’s.


He and Medina added that as AI features integrate with everything from gaming to enterprise tech, it’s becoming increasingly hard to attribute specific energy demands to AI or non-AI applications. 

Within AI, however, model needs are evolving. “It’s quite significant,” Gadepally said of the energy increase compared to earlier in the technology’s history. He noted that inference — when a model makes predictions after it’s been trained — now accounts for much more of a model’s lifetime cost. “That wasn’t the case with some of the original models, where you might spend a lot of your effort training this model, but the inference is actually pretty easy — there wasn’t much compute that needed to happen.” 

Is using ChatGPT bad for the environment? 

Because AI has become inextricably tied up in existing technology, experts say it’s difficult to determine its specific impact. Whether to use it or not may come down to individual judgment more than hard numbers. 

“From a sustainability perspective, you have to balance the output of the AI with the use of the AI,” Medina said. “If that output is going to save you time that you would have your lights on, your computer on, and you’re writing something that takes you an hour, but [AI] can do it in five minutes, what’s the trade-off there? Did you use more energy taking 30 minutes to write something that they can write you in one minute?”

Also: How AI hallucinations could help create life-saving antibiotics

To Medina’s point, AI can also be used to advance research and technology that helps track climate change in faster, more efficient ways. Ai2 has launched several AI tools that help collect planetary data, improve climate modeling, preserve endangered species, and restore oceans. Referencing data from the Sustainable Production Alliance, AI video company Synthesia argues that AI-generated video produces less carbon than traditional methods of video production, which rely on travel, lighting, and other resource-intensive infrastructure. 

Regardless, parts of the industry are responding to concerns. In February, Hugging Face released the AI Energy Score Project, which features standardized energy ratings and a public leaderboard of where each model stands in its estimated consumption. 

Are there greener alternatives for AI?

Across the industry, organizations are exploring ways to improve AI sustainability over time. At MIT’s Lincoln Lab, Gadepally’s team is experimenting with “power-capping,” or strategically limiting the power each processor uses to below 100% of its capacity, which reduces both consumption and GPU temperature. Chinese AI startup DeepSeek achieved a similar outcome by being more efficient with how it runs and trains its models, though they are still quite large. 
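The logic behind power-capping is that energy is power multiplied by time: capping a GPU slows a job slightly, but total energy still falls. A toy calculation makes the trade-off concrete (the wattage and slowdown figures here are illustrative assumptions, not measurements from the Lincoln Lab team):

```python
# Illustrative only: a GPU capped below its maximum power draw runs a job
# slightly slower, but total energy (power x time) still drops.
def job_energy_wh(power_watts: float, runtime_hours: float) -> float:
    """Energy consumed by one job, in watt-hours."""
    return power_watts * runtime_hours

uncapped = job_energy_wh(power_watts=400, runtime_hours=10.0)  # 4,000 Wh
capped = job_energy_wh(power_watts=300, runtime_hours=10.5)    # 3,150 Wh, ~5% slower

saving = 1 - capped / uncapped
print(f"Energy saved by capping: {saving:.1%}")  # ~21% less energy for a 5% slowdown
```

Under these assumed numbers, a 25% power cap costs 5% in speed but saves over a fifth of the job's energy, which is why the technique reduces both consumption and GPU temperature.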

That approach can only go so far, though. “No one’s figured out how to make a smaller model suddenly do better on high-quality image generation at scale,” Gadepally said.

Also: What is sparsity? DeepSeek AI’s secret, revealed by Apple researchers

Because he doesn't see demand for AI waning, especially with on-device phone features multiplying, Gadepally said efficiency and optimization are the solutions for now. "Can I improve my accuracy by one and a half percent instead of one percent for that same kilowatt-hour of energy that I'm pumping into my system?"

He added that simply switching data centers to renewable energy isn't easy: renewable sources can't ramp up and down as quickly as natural gas, and large-scale computing needs steady, on-demand power. But by slowing the growth curve of AI's consumption with tactics like power capping, it becomes easier to eventually replace those energy sources with renewable ones, much like replacing your home lightbulbs with LEDs.

To move toward sustainability, he suggested companies consider being flexible about where they run compute, as some regions are more energy efficient than others, or about training models during colder seasons, when demands on a local energy grid are lower. An added benefit of this approach is that it lowers processor temperatures without significantly impacting model performance, which can make outputs more reliable. It also reduces the need for cooling with potable water. Benefits like these, along with the resulting cost savings, give companies an incentive to make sustainability-forward changes.

Gadepally believes companies have the right intentions toward sustainability; he thinks it’s a question of whether they can implement changes fast enough to slow environmental damage. 

Should you use AI if you care about the environment?

If you’re worried about how your AI use impacts your carbon footprint, it’s not so simple to untangle. Avoiding AI tools might not help reduce your carbon footprint the way other lifestyle choices can. 

Andy Masley, director of advocacy group Effective Altruism DC, compared the impact of asking ChatGPT 50,000 fewer questions (10 questions every day for 14 years) to other climate-forward actions from philanthropic network Founders Pledge.


The results are pretty minuscule. “If individual emissions are what you’re worried about, ChatGPT is hopeless as a way of lowering them,” Masley wrote. “It’s like seeing people who are spending too much money, and saying they should buy one fewer gumball per month.” 

“It saves less than even the ‘small stuff’ that we can do, like recycling, reusing plastic bags, and replacing our lightbulbs,” Ritchie added in a Substack post referencing Masley. “If we’re fretting over a few queries a day while having a beef burger for dinner, heating our homes with a gas boiler, and driving a petrol car, we will get nowhere.”

Also: The best AI chatbots of 2025: ChatGPT, Copilot, and notable alternatives

In the big picture, Masley and Ritchie are concerned that focusing on AI energy consumption could distract well-intentioned users from larger, more pressing climate stressors. 

Gadepally agreed that abstaining from AI only gets you so far. “In this day and age, it’s almost like saying, ‘I’m not going to use a computer,'” he said. Still, he has a few suggestions for improving the future of AI energy use and creating more transparency around the subject. Here are a few approaches you can try: 

Demand transparency from providers

With the right data, firms like Gadepally’s can at least generate estimates of how much energy AI is using. Individuals can organize to ask AI companies to make this information public. The AI playing field is only getting more competitive; he said that theoretically, as with any other social value, if enough users indicate they care about the sustainability of their tools, it could become a market mover. 

Speak up during procurement processes 

Sustainability is often already a consideration in corporate-level decisions, especially when businesses are weighing vendors and services. Gadepally believes in the power of applying that culture to AI. If your business is licensing AI tools, he suggests asking for energy usage and sustainability data during negotiations.

“If large companies demand this on multi-million dollar contracts that are working with account executives, that can get very far,” he pointed out, as they already do for other line items like work travel. “Why wouldn’t you ask about this, where it really does add up pretty quickly?” 

Use the smallest possible model

Be intentional about the quality of the model you choose for a query relative to your needs. “Almost every provider has multiple versions of the model — we tend to use probably the highest quality one that we have access to,” which can be wasteful, Gadepally noted. “If you’re able to get away with something smaller, do that.” 

As part of this, Gadepally encourages users to accept getting imperfect results more often. Back-and-forth prompt refinement, for example, can be done with a lower-quality model; once you perfect your prompt, you can try it with a more expensive, higher-parameter model to get the best answer. 
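That workflow can be sketched numerically. The per-query energy figures below are hypothetical, chosen only to illustrate the saving:

```python
# Hypothetical per-query energy costs (Wh) for a small vs. a large model.
SMALL_MODEL_WH = 0.3
LARGE_MODEL_WH = 3.0

def workflow_energy(refinement_rounds: int, use_small_for_drafts: bool) -> float:
    """Total energy: N rounds of prompt refinement, plus one final query."""
    draft_cost = SMALL_MODEL_WH if use_small_for_drafts else LARGE_MODEL_WH
    return refinement_rounds * draft_cost + LARGE_MODEL_WH

all_large = workflow_energy(refinement_rounds=5, use_small_for_drafts=False)
mixed = workflow_energy(refinement_rounds=5, use_small_for_drafts=True)
print(f"Large model throughout: {all_large} Wh")
print(f"Drafts on small model:  {mixed} Wh")
```

If the small model really were 10x cheaper per query, five rounds of refinement on it followed by one large-model query would use a fraction of the energy of running every round on the large model.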

In addition to these goals, Michelle Thorne, director of strategy at The Green Web Foundation – a nonprofit “working towards a fossil-free internet” – urged tech companies to phase out fossil fuels across their supply chains and take steps to reduce harms when mining for raw materials. 

What comes next? 

The industry at large is responding to sustainability questions with initiatives like the Frugal AI Challenge, a hackathon at the 2025 AI Action Summit, which took place in Paris this past February. Google said in its sustainability goals that it intends to replenish 120% of the freshwater it consumes across its offices and data centers by 2030. 

Some argue that the bigger-is-better approach in AI may not actually yield more value or better performance, citing diminishing returns. 

Also: Why neglecting AI ethics is such risky business – and how to do AI right

Ultimately, however, regulation will likely prove more effective in standardizing expectations and requirements for tech companies to manage their environmental impact, within and beyond their use of AI. 

Long-term, AI expansion (and the costs that come with it) shows no signs of stopping. "We have sort of an insatiable appetite for building more and more technology, and the only thing that keeps you limited has been cost," Gadepally said, a nod to Jevons paradox, the idea that efficiency gains tend to increase overall consumption rather than reduce it.

For now, AI’s energy future is unclear, but the tech industry at large is an increasingly significant player in a climate landscape marked by skyrocketing demand and very little time. 
