Sustainable AI: Counting the Environmental Footprint of Generative Models

sustainable-ai environment energy responsible-ai

Generative AI is transforming industries, but it comes with a growing environmental cost. Training and running large models requires significant energy, and that energy has a carbon footprint.

As AI becomes more central to business operations, organisations need to understand and address this impact. This article breaks down the environmental costs and outlines practical steps for building more sustainable AI systems.

The scale of the problem

Large language models and other generative AI systems are computationally intensive. Their environmental impact accrues at multiple stages: training, inference and the supporting infrastructure.

Training a large model can consume enormous amounts of energy. Some estimates suggest that training a single large language model can emit hundreds of tonnes of CO2 equivalent, comparable to the lifetime emissions of several cars. But training is only part of the story. Every query to a deployed model consumes energy, and at scale, inference can consume more energy over a model's lifetime than training did. As adoption grows, so does this impact.
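To see how inference can overtake training, here is a back-of-envelope sketch in Python. Every figure in it (training energy, per-query energy, query volume) is an illustrative assumption, not a measurement of any real model:

```python
# Back-of-envelope: cumulative inference energy vs one-off training energy.
# All numbers below are illustrative assumptions, not real measurements.

training_energy_kwh = 1_000_000    # hypothetical one-off training cost (1 GWh)
energy_per_query_kwh = 0.0003      # hypothetical 0.3 Wh per inference request
queries_per_day = 50_000_000       # hypothetical adoption level

daily_inference_kwh = energy_per_query_kwh * queries_per_day
days_to_match_training = training_energy_kwh / daily_inference_kwh

print(f"Daily inference energy: {daily_inference_kwh:,.0f} kWh")
print(f"Inference matches training energy after {days_to_match_training:.0f} days")
```

Under these assumed figures, serving queries matches the entire training budget in roughly two months; everything after that is additional impact that scales with adoption.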

The infrastructure matters too. Data centres that power AI require cooling, which adds to energy consumption. The hardware itself has embodied carbon from manufacturing, and water usage for cooling is significant in some regions. GPUs and specialised AI chips have finite lifespans, and manufacturing them is energy-intensive. Disposal creates electronic waste.

Why this matters for organisations

Environmental impact is increasingly material for businesses. Governments are introducing requirements for carbon reporting and sustainability disclosures, and AI systems may need to be included in these reports. Customers, employees and investors increasingly care about environmental performance, and visible sustainability efforts can be a competitive advantage.

There are practical benefits too. Energy costs money. More efficient AI systems are also cheaper to run. Sustainability and cost efficiency often align. And dependence on energy-intensive systems creates exposure to energy price volatility and potential supply constraints.

Measuring your AI footprint

Before you can reduce impact, you need to understand it. Document which AI systems you are running, the models, hardware and usage patterns across your organisation. Consider where computation is happening—cloud providers vary significantly in their energy sources and efficiency, and on-premises infrastructure may differ again. Track how often models are used, because inference at scale can dwarf training costs. And understand the energy mix: computation powered by renewable energy has lower carbon impact than that powered by fossil fuels.
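The core arithmetic behind a footprint estimate is simple: energy consumed multiplied by the carbon intensity of the grid that supplied it. A minimal sketch follows; the intensity figures are rough illustrations (real values vary by region and by hour, so use the figures from your provider's dashboard):

```python
# Estimate emissions: energy (kWh) x grid carbon intensity (kg CO2e per kWh).
# Intensity values below are rough illustrations, not official grid data.

GRID_INTENSITY_KG_PER_KWH = {
    "coal_heavy_grid": 0.8,
    "mixed_grid": 0.4,
    "renewable_heavy_grid": 0.05,
}

def estimate_emissions_kg(energy_kwh: float, region: str) -> float:
    """Return estimated kg CO2-equivalent for a workload in a given region."""
    return energy_kwh * GRID_INTENSITY_KG_PER_KWH[region]

# The same 10,000 kWh workload, placed in different regions:
for region in GRID_INTENSITY_KG_PER_KWH:
    print(f"{region}: {estimate_emissions_kg(10_000, region):,.0f} kg CO2e")
```

Even this toy calculation makes the "energy mix" point concrete: the same workload can differ by an order of magnitude in emissions depending on where it runs.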

Many cloud providers now offer carbon footprint dashboards. Use them. For on-premises infrastructure, work with facilities teams to understand energy sources and consumption.

Practical steps for sustainable AI

Organisations can take concrete actions to reduce their AI environmental footprint. Not every task needs the largest model—smaller, more efficient models often perform comparably for specific tasks while consuming far less energy. Choose the right size model for each use case.

Techniques like quantisation, pruning and distillation can reduce model size and computational requirements without proportional performance loss. Invest in optimisation before scaling deployment. Use hardware optimised for your workloads, since newer generations of AI accelerators are often significantly more efficient than older ones. Batch requests where possible, cache common outputs, and avoid running large models for tasks that simpler approaches could handle.

Choose cloud regions or data centres powered by renewable energy. Many providers now offer green computing options. Track energy consumption and carbon emissions from AI systems, include AI in your sustainability reporting, and set targets to measure progress. Ask your cloud and AI providers about their sustainability practices and favour those that demonstrate commitment to reducing environmental impact.

The business case for sustainable AI

Investing in sustainable AI is not just about ethics. It makes business sense. Efficient models and infrastructure cost less to run. Less dependence on energy-intensive systems reduces exposure to energy price shocks. Demonstrated sustainability commitment can attract customers, employees and investors. And as regulations tighten, early action avoids scrambling later.

What leaders should do

If you are responsible for AI strategy, commission an assessment of your AI environmental footprint. Include sustainability criteria in AI procurement and development decisions. Set targets for reducing AI energy consumption and carbon emissions. Train teams on sustainable AI practices. Report on AI environmental impact alongside other sustainability metrics. And stay informed about emerging standards and best practices.

The organisations that take sustainability seriously now will be better positioned as expectations and regulations evolve.

The bottom line

Generative AI delivers real value, but it comes with environmental costs that cannot be ignored. Understanding these costs, measuring them accurately, and taking practical steps to reduce them is both responsible and strategically sound. Sustainable AI is not a constraint on innovation. It is an opportunity to build better, more efficient systems.
