Technology
19 August 2025

AI Energy Boom Strains Grids And Sparks Urgent Rethink

Soaring electricity demand from AI data centers exposes weaknesses in aging infrastructure and challenges existing energy policies worldwide.

The world is in the throes of an artificial intelligence (AI) revolution, and the effects are reverberating far beyond the tech industry. As AI and generative AI technologies become increasingly embedded in everyday life, from search engines to app stores, the demand for data, and crucially for the electricity needed to process it, has reached unprecedented levels. According to reporting from Pure Storage and the International Energy Agency (IEA), data centers consumed about 4.4% of total U.S. electricity in 2023, a figure projected to surge to somewhere between 6.7% and 12% by 2028. That’s a staggering leap in just five years, and it’s only the tip of the iceberg.

McKinsey & Co. forecasts that global demand for data center capacity to run generative AI workloads will grow at a compound annual growth rate (CAGR) of 39% through 2030, while demand from other workloads will grow at a 16% CAGR. This explosive growth is mirrored in the physical world: a recently announced hyperscaler data center will be almost the size of Manhattan. The hundreds of millions of people now using AI are, in energy terms, equivalent to millions of new homes being plugged into the grid, as noted in recent coverage from IE columnist Ni Tao and other industry watchers.
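To put those growth rates in perspective, a back-of-envelope calculation shows how they compound. This is a minimal sketch only; the 2023 baseline year is an assumption, since the forecast gives the rates and the 2030 horizon rather than a starting point:

```python
# Illustrative only: how a constant CAGR compounds into a capacity multiple.
# The 2023 baseline year is an assumption, not a figure from the article.
def growth_multiple(cagr: float, years: int) -> float:
    """Cumulative growth factor after `years` of compounding at `cagr`."""
    return (1 + cagr) ** years

for label, cagr in [("generative AI workloads", 0.39), ("other workloads", 0.16)]:
    print(f"{label}: ~{growth_multiple(cagr, 2030 - 2023):.1f}x by 2030")
```

At those rates, demand from generative AI workloads would roughly increase tenfold over seven years, while other workloads would slightly less than triple.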

This surge in demand is straining electrical grids that, in many places, were never designed for such loads. Power grids across Europe, for example, are 30 to 80 years old, stretch up to 1,500 miles in length, and were built for the needs of the 20th century. They are now being pushed to their limits by 21st-century technologies, with international connections growing as countries buy and sell power to balance local fluctuations. In China, as Ni Tao reports, the rapid rise in AI computing power has driven an unprecedented acceleration in data center construction. High-power servers, once niche, are now standard fixtures, and cooling systems are struggling to manage the heat these AI workloads generate.

So, where does all this energy come from? Today’s power generation is a mix of renewables (solar, wind, hydro), nuclear, and fossil fuels. But getting electricity from source to socket is a complex, expensive journey. The grid is a patchwork of old and new, often unable to keep up with the demands of both homes and data centers. According to Pure Storage, the “last mile” problem is acute: substations built for yesterday’s factories and homes can’t handle the flood of electrons needed for today’s data-driven world. Some organizations are experimenting with advanced conductors, such as carbon-fiber-core and superconducting cables, to boost capacity, but these innovations are still years away from widespread use.

Against this backdrop, the rise of AI is creating a “square peg/round hole” dilemma for both industry and government. As Pure Storage puts it, “the industry is behind the curve of demand.” GPU manufacturers are shipping hundreds of thousands of GPUs every quarter, with the latest models consuming roughly 30 kilowatt-hours per day—the same as a typical four-person household. Multiply that by the scale of deployment, and it’s clear why electricity demand is spiraling.
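The arithmetic behind that claim is straightforward. Here is a rough, hypothetical illustration; the fleet size below is assumed for the sake of the example and is not a figure from the article:

```python
# Hypothetical illustration: scaling the ~30 kWh/day-per-GPU figure to a fleet.
KWH_PER_GPU_PER_DAY = 30           # article's figure for a latest-generation GPU
KWH_PER_HOUSEHOLD_PER_DAY = 30     # article's comparison: a four-person household

fleet_size = 500_000               # assumed fleet, e.g. one quarter's shipments

fleet_kwh_per_day = fleet_size * KWH_PER_GPU_PER_DAY
print(f"{fleet_kwh_per_day / 1e6:.0f} GWh per day")                           # 15 GWh/day
print(f"equivalent to {fleet_kwh_per_day // KWH_PER_HOUSEHOLD_PER_DAY:,} households")
```

Under those assumptions, a single quarter’s shipments would draw as much electricity each day as half a million homes.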

Policy, too, is struggling to keep pace. The past decade has been dominated by the so-called Green Agenda, with tax breaks and subsidies for solar and wind energy, while fossil fuels have been penalized and nuclear energy has faced reputational attacks, especially from politicians and environmental groups. Yet, as one recent analysis put it, “the rise of AI presents a challenge that requires pragmatism rather than idealism.” The current policy frameworks in most advanced economies simply cannot accommodate the energy needs of AI at scale.

Nowhere is this tension more visible than in the efforts to keep data centers online. In the UK, the government has designated data centers as Critical National Infrastructure, ensuring they remain operational even in the face of attacks or malicious activity. This reflects a growing recognition that data is as vital as water or electricity in the modern economy. Yet, as more data centers come online, the pressure on national grids intensifies. Already, there are well-documented cases of power shortages in places like West London and Dublin, where local grids can’t keep up.

Cooling is another major challenge. Historically, water cooling was common in data centers, but many operators have since switched to electrically powered chillers and other cooling strategies. This means even more power is diverted from computation to cooling, further increasing overall demand. In China, cooling technologies are being pushed to the brink as high-power servers become the norm, according to Ni Tao’s reporting.

Mining for metals is also ramping up to meet the needs of new grids and electrification. In 2022 alone, 2.8 billion tons of metals were mined globally, including 2.6 billion tons of iron, 69 million tons of aluminum, and 22 million tons of copper. But experts warn that 10 to 100 times more copper will be needed to build out the grids and infrastructure required for a fully electrified, AI-powered world.

What’s to be done? The answer isn’t as simple as just generating more power. The grid itself needs a massive overhaul. Manufacturing, food production, and industry are still heavily dependent on fossil fuels, and these sectors will need to be reinvented to align with new energy realities. Load shedding and brownouts are already a reality in some countries, and the risk of regions running out of power is real.

On the industry side, tech leaders are being called to account. Pure Storage notes that the IT industry “has had a pass in recent years. That must change.” The company advocates for greater transparency in reporting power consumption and embedded carbon at both the device and fleet levels, a focus on energy-efficient technologies, and building a circular economy to reuse equipment. Pure Storage itself has improved its single-system capacity from 5TB a decade ago to 6PB today, a roughly 1,200-fold increase, while making systems physically smaller and less power-hungry. If cars had improved at the same rate, we’d be driving around the earth in ten minutes on a single tank of fuel!
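The 1,200-fold figure follows directly from the two capacities quoted, as a one-line check shows:

```python
# Quick arithmetic check of the capacity comparison (6 PB vs. 5 TB).
old_tb, new_tb = 5, 6_000   # 6 PB expressed in terabytes
print(f"{new_tb / old_tb:,.0f}x increase in single-system capacity")  # 1,200x
```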

Innovation will be key. Short-term solutions like small modular nuclear reactors could help meet energy needs, since they’re relatively simple to deploy and can be online in a few years rather than decades. In the longer term, embracing state-of-the-art components for compute, networking, storage, and cooling, as well as software innovations like enhanced compression and deduplication algorithms, will be critical. Even novel technologies like ceramic data storage are being explored as part of the solution.

Ultimately, the data and AI boom is not going to slow down. Consumer and business demand is only growing, and the only way to meet these needs is to invest in innovation, embrace change, and be sensible with the resources available. As the world races to keep up with AI’s insatiable appetite for energy, the challenge will be to balance progress with sustainability, and pragmatism with ambition.