The Secret Energy of AI

Every time you ask a generative AI tool a question, generate an image, or run a search powered by machine learning, something happens that most people never think about: energy is consumed. Quite a lot of it. The rapid growth of artificial intelligence has brought extraordinary capabilities to businesses and consumers alike, but it has also placed a significant and growing demand on electricity systems around the world. For organisations working toward net zero targets and energy compliance, understanding the hidden energy cost of AI is becoming increasingly important.
The Data Centre Behind Every Query
AI does not run on the device in your hand. It runs in data centres – vast, warehouse-scale facilities packed with tens of thousands of specialised computer servers that process requests around the clock. These facilities require enormous amounts of electricity not only to power the servers themselves, but also to run the cooling systems that prevent the hardware from overheating. A single large data centre can consume as much electricity as a small town.
According to the International Energy Agency (IEA), global data centre electricity consumption reached approximately 415 terawatt-hours (TWh) in 2024, accounting for around 1.5% of the world’s total electricity use.1 That figure has been growing steadily, but with the explosion of AI-focused infrastructure, it is now accelerating at a much faster pace. The IEA’s analysis projects that data centre electricity demand will more than double by 2030, reaching around 945 TWh.2
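As a back-of-envelope check, the IEA figures above (roughly 415 TWh in 2024, rising to around 945 TWh by 2030) imply a compound annual growth rate in the region of 15%. The implied growth rate below is an illustration derived from those two data points, not an IEA statistic.

```python
# Illustrative calculation from the IEA figures cited above:
# ~415 TWh in 2024, projected ~945 TWh in 2030.
consumption_2024_twh = 415
projected_2030_twh = 945
years = 2030 - 2024

growth_factor = projected_2030_twh / consumption_2024_twh
annual_growth = growth_factor ** (1 / years) - 1  # implied compound annual rate

print(f"Total growth: {growth_factor:.2f}x over {years} years")
print(f"Implied compound annual growth: {annual_growth:.1%}")
```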
Why AI Uses So Much Energy
AI systems are energy-intensive for two distinct reasons: training and inference. Training is the process by which an AI model learns from vast datasets. It involves running billions of calculations repeatedly over days or weeks, consuming substantial amounts of electricity in the process. Training a single large language model can produce carbon emissions comparable to several transatlantic flights.
Inference – the act of using a trained model to generate a response – is less energy intensive per query, but the sheer volume of requests adds up quickly. Every search, chatbot interaction, image generation, or recommendation algorithm running on AI infrastructure requires continuous power. As AI tools become embedded into everyday business software, the cumulative energy demand grows at a pace that is difficult to overstate.
AI-focused data centres are also more power-hungry per unit than traditional ones. They typically run on graphics processing units (GPUs) and purpose-built AI chips, which deliver far greater computational performance but also draw significantly more power per server rack. Where a conventional server rack might consume around 5 to 10 kilowatts, AI-optimised racks can draw 100 kilowatts or more.
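To put those rack figures in annual terms, a simple sketch follows. It assumes continuous full-load operation, which is a simplification; real racks vary with utilisation.

```python
# Annual electricity use for the rack power figures cited above,
# assuming (as a simplification) continuous full-load operation.
HOURS_PER_YEAR = 24 * 365

def annual_mwh(rack_kw: float) -> float:
    """Annual consumption in MWh for a rack drawing rack_kw continuously."""
    return rack_kw * HOURS_PER_YEAR / 1000

conventional = annual_mwh(10)   # upper end of a conventional rack
ai_optimised = annual_mwh(100)  # lower end of an AI-optimised rack

print(f"Conventional rack: ~{conventional:.0f} MWh/year")
print(f"AI-optimised rack: ~{ai_optimised:.0f} MWh/year")
```

On these assumptions, a single AI-optimised rack at 100 kW would use roughly ten times the electricity of a conventional rack at 10 kW over a year.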
The Carbon and Water Footprint
The carbon impact of this energy use depends heavily on the electricity sources powering data centres. Where the grid is dominated by fossil fuels, the emissions can be substantial. Research published in 2025 estimated that AI systems alone could account for between 32.6 and 79.7 million tonnes of CO₂ emissions that year.3 For context, that upper estimate is comparable to the annual emissions of a medium-sized European country.
There is also a water dimension that is often overlooked. Many data centres rely on water-based cooling systems to manage heat, and AI infrastructure is particularly demanding in this respect. The IEA estimated global data centre water consumption at 560 billion litres in 2023.2 Research by De Vries-Gao suggests this may understate the true picture, calculating that AI systems alone could consume between 312.5 and 764.6 billion litres – a volume comparable to global annual bottled water consumption.3 The IEA projects total data centre water consumption could rise to around 1,200 billion litres by 2030 as facilities expand to meet growing AI demand.2,4 In a context where water scarcity is an increasing concern, this is a significant consideration for any organisation assessing its environmental impact.
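Those water volumes are easier to picture in familiar units. The sketch below converts the IEA figures into Olympic swimming pools; the pool volume of roughly 2.5 million litres is a standard approximation, not a figure from the article's sources.

```python
# Converting the IEA water figures cited above into Olympic pools.
# An Olympic pool holds roughly 2.5 million litres (a standard approximation).
OLYMPIC_POOL_LITRES = 2.5e6

water_2023_litres = 560e9    # IEA estimate for 2023
water_2030_litres = 1200e9   # IEA projection for 2030

pools_2023 = water_2023_litres / OLYMPIC_POOL_LITRES
pools_2030 = water_2030_litres / OLYMPIC_POOL_LITRES

print(f"2023: ~{pools_2023:,.0f} Olympic pools")
print(f"2030: ~{pools_2030:,.0f} Olympic pools")
```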
What This Means for UK Organisations
For UK businesses, the implications of AI’s energy appetite are becoming a practical concern across several areas. Organisations with sustainability reporting obligations – including those subject to SECR (Streamlined Energy and Carbon Reporting) – need to understand where AI sits within their energy and carbon footprint. While the energy consumed by third-party data centres may fall under Scope 3 emissions rather than a company’s direct consumption, the direction of travel in reporting standards is toward greater transparency across the full value chain.
There is also a domestic infrastructure angle. The UK is seeing a surge in data centre construction, particularly around London and the South East. By 2035, data centres could account for as much as 20% of the UK’s projected total CO₂ emissions if growth continues unchecked.6 This is placing new pressures on the national grid and on electricity networks in areas with high concentrations of facilities, which in turn affects the energy costs and supply reliability for businesses in those regions.
Can AI Be Part of the Solution?
The picture is not entirely bleak. AI also has the potential to contribute meaningfully to energy efficiency and decarbonisation efforts. Machine learning is already being used to optimise grid operations, improve demand forecasting, reduce waste in industrial processes and enhance the performance of renewable energy systems. Some of the largest technology companies operating data centres have made significant commitments to renewable energy procurement and are investing in next-generation cooling technologies designed to reduce both energy and water use.
The IEA notes that efficiency improvements in AI hardware and software have been significant, with some AI tasks now requiring far less energy than they did only a few years ago.2 However, the pace of adoption is outstripping these efficiency gains. The net result, in the near term at least, is that total energy demand continues to rise even as the energy cost per query falls. The challenge is to ensure that the transition to cleaner energy keeps pace with this expanding demand.
A Growing Consideration for Energy Strategy
For businesses looking to manage their energy use and carbon footprint, AI is no longer something that can be considered entirely separate from energy strategy. As AI tools are adopted more widely – from customer service platforms to procurement systems to building management – the energy implications of those tools become part of the operational picture.
Organisations that are proactive about understanding their energy and carbon footprint across all sources – including digital infrastructure – will be better placed to respond to evolving reporting requirements, demonstrate progress against net zero commitments and make more informed procurement decisions when selecting technology platforms. Asking questions about a technology supplier’s energy sourcing and efficiency commitments is a reasonable and increasingly standard part of due diligence.
How 2EA Can Help
At 2EA, we help organisations understand, measure and reduce their energy use and carbon emissions. As the energy landscape shifts, the need for clear, practical energy management has never been greater. Whether you are working toward ESOS compliance, developing a carbon reduction plan, or simply looking to get a better handle on your organisation’s energy consumption, our team can provide the expert guidance you need.
References
1. Brookings Institution – Global energy demands within the AI regulatory landscape (2026).
2. International Energy Agency – Energy and AI: Executive Summary (2025).
3. De Vries-Gao, A. et al. – The carbon and water footprints of data centers and what AI means for them (ScienceDirect, 2025).
4. Impakter – Is Water Usage in AI Data Centres Sustainable? (2026).
5. UK Government Sustainable ICT Blog – AI's thirst for water (2025).
6. UNISON Magazine – Is AI green? (2026).