What happened
Artificial intelligence (AI) model development, particularly for large language models, is now constrained more by electricity availability than by chip advancements, making power supply a significant limit on operational scaling. Training these models requires high-performance computing infrastructure that runs thousands of GPUs continuously for weeks or months. As a result, a single AI-focused data centre can consume as much electricity as 100,000 households, and overall data centre electricity consumption is projected to more than double by 2030, driven largely by AI. In response, technology firms are building proprietary power infrastructure, taking on the burden of securing their own energy supply.
Why it matters
This escalating electricity demand places a critical operational constraint on infrastructure and energy-procurement teams, increasing the oversight burden of securing reliable, sustainable power. The shift to proprietary power infrastructure raises due-diligence requirements for assessing energy sources, including natural gas, renewables, and nuclear, and for managing their associated environmental impacts. It also increases exposure for compliance and sustainability teams to potential policy mismatches over energy consumption and greenhouse gas emissions, as the industry pursues diverse solutions to meet AI's growing energy needs.