AI is transforming how companies work. It's also changing their carbon footprint in ways many organizations are only beginning to understand.
The real-world impact from AI’s explosive growth is nuanced: while grids are getting greener, AI workloads are growing faster than renewable energy can keep pace. But companies have tools and strategies available to understand and reduce the emissions of their own AI use—and push vendors towards more sustainable and transparent practices.
Where AI emissions actually come from
When we talk about AI's environmental impact, we're mostly talking about the emissions from powering large language models—generative AI tools, which are increasingly used across enterprises in every sector. LLMs run almost exclusively on GPUs (graphics processing units), specialized chips designed for the parallel computation these models require.
About 81% of AI-related emissions come from the electricity used to power GPUs as they run AI workloads. Another 18% comes from data center electricity for cooling and operations, and roughly 1% comes from GPU manufacturing.
Within that GPU electricity usage, there's a distinction between training new models and inference, or the ongoing use of those models: "AI providers have spent a large sum of money and a large sum of electricity training these models, but most of the energy of the lifespan of these models come from inference," says Steven Watson, Watershed’s Head of Measurement.
Inference means every query, every response, and every interaction; this accounts for more than 90% of GPU electricity use, while training accounts for less than 10%.
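The shares above translate into simple arithmetic: a workload's inference footprint is its query volume, times energy per query, times the data center's overhead factor, times the grid's carbon intensity. A minimal sketch, with all figures hypothetical placeholders rather than values from this article:

```python
# Illustrative sketch: estimating inference emissions for an AI workload.
# All numeric figures below are hypothetical, chosen only to show the math.

def inference_emissions_kg(
    queries: int,
    kwh_per_query: float,         # energy per query (a key efficiency metric)
    pue: float,                   # data center Power Usage Effectiveness (>= 1.0)
    grid_kg_co2e_per_kwh: float,  # carbon intensity of the local grid
) -> float:
    """Emissions = queries x energy/query x PUE x grid intensity."""
    return queries * kwh_per_query * pue * grid_kg_co2e_per_kwh

# Hypothetical example: 1M queries at 0.003 kWh each, PUE of 1.2,
# on a grid emitting 0.4 kg CO2e per kWh.
kg = inference_emissions_kg(1_000_000, 0.003, 1.2, 0.4)
print(round(kg, 1))  # 1440.0
```

Because queries multiply everything else, efficiency gains per query can be swamped by growth in query volume—a dynamic the article returns to below.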
Put AI emissions in context
For most companies using third-party cloud providers, AI emissions fall into scope 3, category 1 (purchased goods and services). Scope 3 is where the majority of companies' emissions already live—typically 80-90% of their total footprint. Many of the strategies you'd use to reduce other scope 3 emissions apply to reducing AI emissions.
Three ways to minimize AI's footprint
Leading companies are approaching AI sustainability from multiple angles. Based on work at organizations like Autodesk and Okta, here are strategies that are working:
1. Use AI efficiently and intentionally
Alison Colwell, Head of Sustainability and Responsible Technology at Okta, spent eight months developing internal guidance on sustainable AI use (more on Okta’s Sustainable AI strategy here). The framework centers on three principles:
When to use AI: "We're an AI-forward company. We secure AI agents for our customers and we're using AI tools for productivity and innovation. We want to use AI smartly, efficiently, meaningfully," Colwell explains. Okta’s focus is on deploying AI where it drives the greatest impact.
Choosing the right model: "Choosing the lightest model for the use case, that's really important, because it can have energy savings and can also have cost savings," says Colwell. "And it sometimes can be faster and more accurate if you have the model that's right for your use case and limit information to what you need."
Newer models are often more efficient, but right-sizing the model for the job at hand matters too. The biggest model isn't always the best fit for a given use case—as Watershed’s climate scientist Shaena Ulissi puts it, “don’t use a Hummer when a bicycle will do.”
Prompting efficiently: How you prompt generative AI affects both energy use and cost. The goal should be to get the information you need as quickly as possible.
2. Partner with your cloud efficiency team
Autodesk recognized that sustainability and cloud efficiency teams share aligned goals. "If we can run our cloud in the most efficient and cost-saving way possible, we'll be optimizing resource use and also saving emissions," says Jessica Mi, a Senior Program Analyst on the Sustainable Operations & ESG team at Autodesk.
This partnership unlocked real results. Autodesk's Translation Avoidance Initiative—which changed compute-intensive translation services from a default action to on-demand—saved 15% in both cost and associated emissions. "Being able to enrich that cloud efficiency story with emissions savings helps strengthen the narrative and emphasize that impact with engineering teams and leadership," Mi explains.
Other optimization strategies include:
- Running servers in regions with greener grids. Note: This is only available for companies accessing LLMs through cloud-hosted models (Azure, AWS, Google Cloud).
- Scheduling energy-intensive tasks during off-peak hours or weekends when grid carbon intensity is lower.
- Optimizing workloads and server usage to maximize efficiency.
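The first two strategies above reduce to the same decision: among the regions and time windows available to you, run schedulable workloads where grid carbon intensity is lowest. A minimal sketch—region names and intensity figures are hypothetical, not real provider data; in practice you'd pull them from your cloud provider or a grid-data service:

```python
# Illustrative sketch: picking the lowest-carbon option for a schedulable
# AI workload. All regions and kg CO2e/kWh values are hypothetical.

grid_intensity = {
    ("north-eu", "off-peak"): 0.12,
    ("north-eu", "peak"):     0.18,
    ("us-east",  "off-peak"): 0.35,
    ("us-east",  "peak"):     0.48,
}

def greenest_slot(options: dict) -> tuple:
    """Return the (region, time window) with the lowest grid carbon intensity."""
    return min(options, key=options.get)

region, window = greenest_slot(grid_intensity)
print(region, window)  # north-eu off-peak
```

The same comparison also quantifies the saving: moving a job from the worst slot to the best one here would cut its electricity emissions by a factor of four, for the same compute.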
3. Use procurement to drive efficiency
"Sustainable procurement is a very exciting opportunity," says Watson. "We get to use procurement as the reward for things that we value. For sustainability, these can be things like disclosure, reporting, and progress."
When evaluating AI vendors, questions to ask include:
- Data availability: Do they report corporate emissions and, more critically, provide specific metrics like energy (kWh) per token or per query (a key efficiency metric)? Do they split out market-based vs. location-based accounting (this will determine the accuracy of their emissions estimates)? Do they provide third-party assurance on their emissions reporting?
- Additionality: How are they procuring renewable energy? What is the Power Usage Effectiveness (PUE) of the data centers handling your workload? Are they investing in on-site renewables? Buying into VPPAs? How much impact does that renewable energy program actually drive?
- Targets: What are they committing to in terms of renewable energy capacity and carbon removals?
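The market-based vs. location-based distinction raised above matters because the two methods can produce very different numbers for the same electricity use: location-based reflects the average grid where the data center sits, while market-based reflects the clean power the vendor has contracted for. A minimal sketch with hypothetical figures:

```python
# Illustrative sketch of market-based vs. location-based accounting.
# All numbers are hypothetical.

electricity_kwh = 10_000   # vendor electricity use attributable to your workload
location_factor = 0.40     # kg CO2e/kWh: average intensity of the local grid
market_factor = 0.05       # kg CO2e/kWh: after contracted renewables (e.g. PPAs)

location_based = electricity_kwh * location_factor  # what the grid actually emits
market_based = electricity_kwh * market_factor      # reflects contractual clean power

print(location_based, market_based)  # 4000.0 500.0
```

A vendor reporting only the lower market-based figure without the location-based one makes it hard to judge how clean their actual operations are—which is why the question asks for both.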
Companies can go a step further—publicizing their intent to choose vendors that prioritize sustainability and transparency. Sustainability teams can be leaders here and apply those principles when procuring AI-powered software. Here’s a sample RFP for evaluating the sustainability of AI tools.
Build visibility to drive action
Grids are getting greener, but data centers are being built rapidly to keep pace with AI growth. The race between these trends will determine the emissions impact of AI, and at the moment data center growth is moving faster than clean energy projects can connect to the grid. As a result, according to Goldman Sachs projections, 60% of new data center power this decade will likely come from natural gas rather than renewables. This means we can’t count on the greening of the grid to counteract the emissions expected from the growth of AI.
This urgency is underlined by the Jevons Paradox: even as individual models become more efficient (Google’s Gemini saw a 33x energy reduction per query in a year), the overall proliferation of AI is projected to cause total energy demand to triple by 2030. This means the rate of adoption is overwhelming the rate of efficiency gains. Visibility and control are therefore more crucial than ever.
What companies can do:
- Expand renewable energy programs to cover AI tools and choose AI suppliers that have science-based sustainability commitments.
- Engage vendors to push for transparency and efficiency improvements.
- Build better visibility into where AI emissions come from.
The companies making progress are integrating AI sustainability into existing cloud optimization efforts, embedding it into procurement decisions, and empowering teams across the organization with data.
Impact starts with information. As Colwell puts it: "The two goals really are to have better information to empower our teams, like engineers and employees, on where we are consuming a lot of energy, and hopefully insights on how we can be more sustainable."

