Blog | Meeting the Energy Demands of the AI Boom
March 6th, 2025
As tech giants like Alphabet, Amazon, Microsoft, and Meta collectively invest billions in AI-related projects, the demand for computing power is soaring. This exponential growth highlights the significant relationship between AI and energy consumption. As AI technology advances and adoption accelerates, its energy demands are increasing rapidly. Recent reports indicate that data center developers are frequently requesting “several gigawatts” [ref] of power from utilities to support their AI workloads. This surge in energy demand raises serious concerns about the strain it places on the electrical grid.
AI has become a part of everyday life, with large language models (LLMs) like ChatGPT being widely adopted for tasks ranging from research to creative writing. These AI systems rely on energy-intensive Graphics Processing Units (GPUs) for two main activities: training models (20% of energy use) and executing user requests (80%). According to the World Economic Forum, the computational power required to sustain AI is doubling approximately every 100 days [ref]. By 2028, AI’s energy consumption could surpass the total electricity used by Iceland in 2021 [ref]. Furthermore, even a tenfold improvement in AI model efficiency could be accompanied by a 10,000-fold increase in computational power demand, as efficiency gains tend to accelerate adoption. Currently, AI’s energy needs are growing at an annual rate of 26% to 36%, with estimates suggesting it could consume between 85 and 134 terawatt-hours (TWh) annually by 2027 [ref]. By 2030, AI could account for 3-4% of global electricity demand, comparable to the energy consumption of countries like Russia or Japan.
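To see how quickly a 26% to 36% annual growth rate compounds, here is a short illustrative calculation. The 2023 baseline figure is a hypothetical assumption chosen for demonstration, not a sourced value; only the growth rates come from the estimates above.

```python
# Illustrative projection of AI electricity demand under the growth
# rates cited above (26%-36% per year). The 2023 baseline of 40 TWh is
# a hypothetical assumption for demonstration, not a sourced figure.

def project_demand(baseline_twh: float, annual_growth: float, years: int) -> float:
    """Compound annual growth: baseline * (1 + rate)^years."""
    return baseline_twh * (1 + annual_growth) ** years

baseline_2023 = 40.0  # TWh, assumed for illustration
for rate in (0.26, 0.36):
    demand_2027 = project_demand(baseline_2023, rate, years=4)
    print(f"{rate:.0%} growth -> {demand_2027:.0f} TWh by 2027")
```

Even from a modest assumed baseline, four years of compounding lands in the same order of magnitude as the 85-134 TWh range cited for 2027, which is what makes these growth rates so consequential for grid planners.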
This rapid increase in AI-driven energy consumption has caused a surge in requests for new interconnections to power the data centers that host AI’s computing infrastructure. These facilities, which require hundreds of megawatts (MW) of power – effectively the output of a small power station – are adding significant pressure to the already strained grid.
The growing electricity demands of AI data centers will require a combination of infrastructure upgrades and innovative solutions. To meet this demand while remaining committed to the energy transition, utilities will need multiple tools to support the growth of renewable energy generation and advanced energy storage systems, ensuring a consistent supply of clean electricity. The most common tool is building new transmission infrastructure to efficiently connect renewable energy sources to demand centers.
However, smart technologies such as flexible interconnections are being deployed to maximize the capacity of existing and new infrastructure, reducing costs and speeding up construction timelines. Flexible interconnections support data center growth by controlling flexible resources within the data center, such as uninterruptible power supplies (UPS), backup generators, battery energy storage systems (BESS), and on-site solar photovoltaics (PV), giving utilities the flexibility needed to ride through potential reliability issues. This flexibility benefits both data center operators and utilities by allowing the data center to be interconnected faster and begin earning revenue sooner. Modern flexibility solutions also build on demand-response systems, enabling more targeted control of electricity flows, especially during peak demand. Together, these strategies aim to create a resilient and sustainable energy framework capable of meeting the growing demands of AI data centers.
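The dispatch logic behind a flexible interconnection can be sketched in a few lines. This is a minimal, hypothetical illustration, not any utility's actual control scheme: all names, capacities, and the priority-order allocation are assumptions made for the example.

```python
# Minimal sketch of flexible-interconnection dispatch: when the utility
# caps grid imports below the facility's load, on-site resources (BESS,
# generators) cover the shortfall. All values here are hypothetical.

from dataclasses import dataclass

@dataclass
class FlexibleResource:
    name: str
    capacity_mw: float  # maximum dispatchable output in MW

def dispatch(load_mw: float, grid_limit_mw: float,
             resources: list[FlexibleResource]) -> dict[str, float]:
    """Allocate the gap between load and the grid import limit across
    on-site resources, in listed priority order."""
    shortfall = max(0.0, load_mw - grid_limit_mw)
    plan = {"grid": min(load_mw, grid_limit_mw)}
    for r in resources:
        take = min(shortfall, r.capacity_mw)
        plan[r.name] = take
        shortfall -= take
    plan["unserved"] = shortfall  # remaining load that must be curtailed
    return plan

# Example: a 300 MW facility asked to hold grid imports to 220 MW
# during a grid stress event.
onsite = [FlexibleResource("bess", 60.0), FlexibleResource("generators", 40.0)]
print(dispatch(load_mw=300.0, grid_limit_mw=220.0, resources=onsite))
```

In this scenario the battery covers 60 MW and the generators the remaining 20 MW, so the facility rides through the constraint with no curtailed load, which is precisely the flexibility that lets utilities energize such sites before full wires upgrades are complete.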
In summary, managing AI’s energy demand is a complex challenge that requires multiple solutions. While traditional wires upgrades remain the end goal, utilities risk losing significant large-load customers if they cannot deliver those upgrades quickly. Smart technologies provide a bridge to wires solutions, helping utilities serve their data center customers in the meantime. These solutions are a win-win for both the data center and the utility, as they shorten the timeline to interconnect and begin earning revenue.