The Grid Is Growing Again

AI data centres now consume 90 terawatt-hours per year, roughly 10x the 2022 figure. AI-optimised racks demand 30-100 kW each. The hunger for power is so extreme it has restarted growth in a stagnant US electrical grid.
8 January 2026·8 min read
John Li
Chief Technology Officer
Mak Khan
Chief AI Officer
For two decades, US electricity demand was essentially flat: efficiency gains offset demand growth. That era ended in 2024. AI data centres now consume 90 terawatt-hours per year, roughly 10x the 2022 figure. Individual AI-optimised racks demand 30-100 kW, up from the 5-15 kW of traditional servers. The hunger for compute is so extreme it has restarted growth in a national power grid that most analysts assumed had plateaued permanently.

Executive Summary

  1. AI power consumption has grown an order of magnitude in four years. Data centre AI workloads hit 90 TWh/year by early 2026, up from approximately 9 TWh in 2022. A single AI training cluster now draws more power than a small city.
  2. Rack density is forcing infrastructure redesign. AI-optimised racks demand 30-100 kW each, compared to 5-15 kW for traditional compute. The response: 800-volt DC distribution systems and liquid cooling as baseline, not optional.
  3. The energy industry is responding with bets that were science fiction five years ago. Microsoft signed a power purchase agreement with Helion Energy for 50 MW of fusion power by 2028. This is a commercial contract, not a research grant. The AI industry's demand for clean power is pulling forward energy technologies by decades.
90 TWh
annual AI data centre power consumption by 2026
Source: IEA, Data Centres and Energy Report, 2025; industry estimates
30-100 kW
power demand per AI-optimised rack (vs 5-15 kW traditional)
Source: Uptime Institute, Data Centre Infrastructure Survey, 2025
50 MW
fusion power Microsoft contracted from Helion Energy by 2028
Source: Microsoft / Helion Energy PPA announcement, 2023
~100 tonnes
CO2 captured by Climeworks' Mammoth plant in first operational year
Source: Climeworks operational data, 2025

The Scale of the Problem

To put 90 TWh in context: that is roughly equal to the entire annual electricity consumption of Belgium. And it is growing. Projections from the IEA and independent analysts suggest AI data centre power demand could reach 150-200 TWh by 2028, depending on training run sizes and inference scaling.
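The growth rates implied by these figures are worth making explicit. A minimal sketch of the arithmetic, using only the numbers quoted above (9 TWh in 2022, 90 TWh in 2026, 150-200 TWh projected for 2028):

```python
# Back-of-envelope growth math for the figures quoted in this article.
# 9 TWh (2022) -> 90 TWh (2026) is a tenfold rise over four years.
cagr = (90 / 9) ** (1 / 4) - 1            # compound annual growth rate
print(f"2022-2026 CAGR: {cagr:.0%}")       # ~78% per year

# The 150-200 TWh projections for 2028 imply growth must slow sharply:
for target in (150, 200):
    implied = (target / 90) ** (1 / 2) - 1  # two years, 2026 -> 2028
    print(f"{target} TWh by 2028 implies ~{implied:.0%}/year")
```

Even the high end of the 2028 projections assumes the growth rate roughly halves, which gives a sense of how extreme the 2022-2026 run has been.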
AI Data Centre Power Consumption (TWh/year)
Source: IEA, Data Centres and Energy Report, 2025; industry estimates
The per-rack numbers tell the engineering story. A traditional enterprise server rack draws 5-15 kW, enough to run a few homes. An AI training rack with eight GPUs draws 30-50 kW. A dense inference rack can hit 100 kW. A single row of these racks needs its own substation.
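Rack density translates directly into how many racks a facility can host. A rough sketch, assuming a hypothetical 5 MW hall and an illustrative PUE (power usage effectiveness) of 1.3 for cooling and conversion overhead:

```python
# How many racks fit in a fixed facility power budget, once cooling and
# conversion overhead (PUE) is accounted for. Illustrative figures only.
def racks_supported(facility_mw: float, rack_kw: float, pue: float = 1.3) -> int:
    """Racks a facility can power after PUE overhead."""
    usable_kw = facility_mw * 1000 / pue
    return int(usable_kw // rack_kw)

budget_mw = 5.0  # hypothetical mid-size hall
for label, kw in [("traditional (10 kW)", 10),
                  ("AI training (40 kW)", 40),
                  ("dense inference (100 kW)", 100)]:
    print(f"{label}: {racks_supported(budget_mw, kw)} racks")
```

The same hall that hosts hundreds of traditional racks supports only a few dozen dense inference racks, which is why power, not floor space, has become the binding constraint.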
Power Per Rack: Traditional vs AI-Optimised
Source: Uptime Institute, Data Centre Infrastructure Survey, 2025
This is not a gradual trend. GPU generations are increasing power draw faster than efficiency gains can offset. NVIDIA's B200 draws more than its H100 predecessor, which drew more than the A100 before it. Each generation delivers more compute per watt, but the total wattage keeps climbing because customers demand more total compute.

The Cooling Equation

Air cooling cannot handle 100 kW racks. The physics is straightforward: moving enough air to dissipate that heat requires so much fan power and floor space that it defeats the purpose.
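The airflow requirement follows from Q = m_dot x cp x dT. A sketch of that calculation, assuming an illustrative 15 K air temperature rise across the rack:

```python
# Rough airflow needed to remove rack heat with air: Q = m_dot * cp * dT.
# Illustrative constants; real designs vary.
CP_AIR = 1005    # J/(kg*K), specific heat of air
RHO_AIR = 1.2    # kg/m^3, air density at ~20 C

def airflow_m3s(heat_kw: float, delta_t: float = 15.0) -> float:
    """Volumetric airflow (m^3/s) to carry away heat_kw at a delta_t rise."""
    mass_flow = heat_kw * 1000 / (CP_AIR * delta_t)  # kg/s
    return mass_flow / RHO_AIR

for kw in (10, 50, 100):
    print(f"{kw} kW rack: {airflow_m3s(kw):.1f} m^3/s of air")
```

A 100 kW rack needs on the order of 5-6 cubic metres of air per second, continuously, through a single rack footprint. Liquid coolant carries the same heat in a fraction of the volume, which is why the industry is moving.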
Liquid cooling has become baseline infrastructure for new AI data centres. Direct-to-chip liquid cooling, where coolant flows directly over GPU dies, is now standard in high-density deployments. Immersion cooling, where entire servers sit in dielectric fluid, is moving from experimental to production.
The shift to 800-volt DC distribution inside data centres mirrors what happened in electric vehicles. Higher voltage means lower current for the same power delivery, which means thinner cables, less resistive loss, and more efficient power conversion. It is practical engineering driven by raw demand.
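The benefit of higher voltage can be sketched with Ohm's-law arithmetic: for fixed power, current falls as voltage rises, and resistive loss scales with the square of the current. The 0.01 ohm run resistance below is an illustrative assumption, not a measured figure:

```python
# For a fixed power delivery P, current I = P / V, and the loss in a
# cable run of resistance R is I^2 * R. Illustrative resistance value.
def cable_loss_w(power_kw: float, volts: float,
                 resistance_ohm: float = 0.01) -> float:
    current = power_kw * 1000 / volts       # amps drawn at this voltage
    return current ** 2 * resistance_ohm    # watts lost in the cable run

for v in (400, 800):
    print(f"{v} V bus: {cable_loss_w(100, v):.0f} W lost per 100 kW rack feed")
```

Doubling the bus voltage quarters the resistive loss for the same cable, which is the same reasoning that pushed electric vehicles from 400 V to 800 V architectures.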

The Energy Bets

The AI industry's power appetite is pulling forward energy technologies that seemed decades away.
Fusion. Microsoft's power purchase agreement with Helion Energy for 50 MW of fusion power by 2028 is a commercial contract with financial penalties for non-delivery. Helion's approach (pulsed field-reversed configuration) is one of several fusion designs that have attracted serious investment since 2023. The contract is significant not because fusion is guaranteed to work on schedule, but because a major technology company is willing to bet on it commercially.
Direct air capture. Climeworks' Mammoth plant in Iceland was designed for 36,000 tonnes of CO2 removal per year. In its first operational year, it captured approximately 100 tonnes. The gap between design capacity and operational output is enormous. The cost sits at $600-$1,000+ per tonne, far above the levels needed for economic viability.
But the trajectory matters. Project Cypress in Louisiana aims for 1 million tonnes per year. Engineered bacteria that consume CO2 and produce jet fuel precursors have moved from laboratory curiosity to pilot-scale demonstration. These are early-stage technologies on steep learning curves.
Fusion contracts, direct air capture, 800-volt DC distribution: these were conference talk topics three years ago, and now they are procurement line items. The AI power problem is forcing the energy industry to innovate faster than it has in decades.
John Li
Chief Technology Officer

What This Means for NZ

New Zealand's position is unusual. The country runs approximately 80-85% renewable electricity already, primarily from hydro and geothermal. Microsoft's NZ North cloud region operates on 100% renewable energy through its Contact Energy partnership.
This creates a genuine competitive advantage for AI workloads that require clean power credentials. As carbon reporting requirements tighten globally, running AI inference on NZ's renewable grid becomes a differentiator, not just an environmental claim.
The constraint is capacity. NZ's total electricity generation is approximately 43 TWh per year. A single large-scale AI training facility could consume a measurable percentage of that. Significant AI infrastructure deployment in NZ would require new generation capacity, likely geothermal or wind, with lead times measured in years.
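How "measurable" that percentage is can be sketched quickly. The facility sizes and the 80% utilisation factor below are illustrative assumptions; only the ~43 TWh national total comes from the text:

```python
# Share of NZ annual generation a hypothetical AI facility would draw.
NZ_GENERATION_TWH = 43.0  # approximate annual total, per the article

def share_of_grid(facility_mw: float, utilisation: float = 0.8) -> float:
    """Fraction of NZ annual generation consumed by a facility of this size."""
    twh = facility_mw / 1000 * 8760 * utilisation / 1000  # MW -> TWh/year
    return twh / NZ_GENERATION_TWH

for mw in (100, 300, 500):
    print(f"{mw} MW facility: {share_of_grid(mw):.1%} of NZ annual generation")
```

A single 100 MW facility, modest by hyperscale standards, would consume well over 1% of the country's annual generation, which is why new capacity, not spare capacity, is the prerequisite.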

The Engineering Reality

For enterprise engineering teams, the grid story has practical implications:
Cloud cost volatility. As power costs rise for data centre operators, those costs pass through to cloud pricing. AI inference costs are already the fastest-growing line item for enterprises with production AI systems. Budget for 15-25% annual increases in compute costs.
Latency and sovereignty trade-offs. Running AI workloads in NZ on renewable energy is clean, but NZ data centres will not match the raw capacity of US or European facilities for large-scale training. The practical split: inference and fine-tuning in NZ, large-scale training offshore.
Sustainability reporting. The grid story cuts both ways. AI delivers productivity gains and creates a carbon footprint. Enterprises will increasingly need to report both sides. Understanding where your AI workloads run, and on what power source, is becoming a governance requirement.
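The budgeting implication of 15-25% annual compute-cost growth compounds quickly. A sketch over a three-year horizon, with a hypothetical base spend:

```python
# Compounding the 15-25% annual compute-cost growth quoted above.
def projected_cost(base: float, annual_growth: float, years: int) -> float:
    """Projected annual spend after compounding growth for `years` years."""
    return base * (1 + annual_growth) ** years

base = 1_000_000  # hypothetical annual AI inference spend, USD
for g in (0.15, 0.25):
    print(f"{g:.0%} growth: ${projected_cost(base, g, 3):,.0f} in year 3")
```

At the high end, spend roughly doubles in three years, so a flat compute budget quietly becomes a shrinking one.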
The grid is growing again. The AI industry broke a 20-year plateau in US power demand and is now the single largest driver of new electricity generation investment globally. For engineers, this is the context in which every architecture decision sits.