Artificial intelligence is transforming the world at lightning speed — and it’s taking our power grids along for the ride.
According to a new report from the International Energy Agency (IEA), global electricity consumption by data centers is set to more than double by 2030, fueled largely by the massive energy demands of AI technologies.
The Paris-based agency’s latest findings paint a complex picture: while AI opens the door to more efficient ways of generating and using electricity, it also introduces serious challenges for global energy security and climate goals.
In its first report focused specifically on AI and energy, the IEA reveals that data centers consumed around 1.5% of the world's electricity in 2024, a share that has been growing by about 12% a year over the past five years.

By the end of this decade, data centers are projected to consume approximately 945 terawatt-hours (TWh) — more than the current electricity usage of Japan.
That surge is largely due to the growing popularity of generative AI, which requires extraordinary computing power to process data stored in vast digital warehouses. The lion’s share of this demand is currently concentrated in the United States, Europe, and China, which together account for about 85% of data center energy consumption.
Tech giants are already bracing for the coming energy crunch.
- Google struck a deal last year to power its AI operations with electricity from small modular nuclear reactors.
- Microsoft plans to source energy from a restarted reactor at the historic Three Mile Island site, infamous for its 1979 nuclear incident.
- Amazon has also signed agreements to tap into nuclear energy for its expanding fleet of data centers.

On its current trajectory, data center consumption could reach around 3% of global electricity by 2030. To put that into perspective, a single 100-megawatt data center can use as much energy as 100,000 households, and the newest facilities could consume as much power as two million homes, according to the IEA.
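For readers who want to sanity-check that comparison, here is a minimal back-of-envelope sketch in Python. The household figure of roughly 8,800 kWh per year and the assumption of round-the-clock, full-load operation are our own illustrative assumptions, not numbers from the IEA report.

```python
# Back-of-envelope check of the "100 MW data center ~ 100,000 households" comparison.
# Assumptions (ours, not the IEA's): the facility draws its full 100 MW all year,
# and an average household uses about 8,800 kWh per year.

DATA_CENTER_MW = 100             # facility capacity in megawatts
HOURS_PER_YEAR = 24 * 365        # continuous operation, 8,760 hours
HOUSEHOLD_KWH_PER_YEAR = 8_800   # assumed average household consumption

# Annual data center consumption in kilowatt-hours: 100 MW -> 100,000 kW
data_center_kwh = DATA_CENTER_MW * 1_000 * HOURS_PER_YEAR  # 876,000,000 kWh

households_equivalent = data_center_kwh / HOUSEHOLD_KWH_PER_YEAR
print(f"Roughly {households_equivalent:,.0f} households")  # ~100,000 households
```

Under those assumptions the arithmetic lands almost exactly on the IEA's 100,000-household figure; a lower household average or a facility running below full load would shift the result accordingly.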
Despite the energy burden, the IEA emphasizes AI’s dual potential: while it drives up demand, it can also help streamline energy systems, cut costs, boost competitiveness, and lower emissions. Smart grids, predictive maintenance, and real-time energy optimization are just a few areas where AI could make a positive impact.
Still, emissions are expected to rise. Carbon output from data center electricity use could climb from 180 million tonnes of CO2 today to around 300 million tonnes by 2035. That is less than 1% of the 41.6 billion tonnes of global emissions recorded in 2024, but a noticeable increase nonetheless.

To stay competitive and energy-resilient, the U.S. has launched the National Energy Dominance Council, a new body aimed at scaling up power generation, as part of President Donald Trump's effort to stay ahead of China in the AI race.
Coal still accounts for about 30% of electricity used in data centers, but that share is expected to shrink as renewables and natural gas become cheaper and more accessible in major markets.
The IEA’s key message? The future of AI is not just about algorithms — it’s also about energy. Striking a balance between innovation and sustainability will be crucial as we power forward into the next phase of the digital era.
Join the conversation:
Do you think AI’s benefits will outweigh its energy costs? How can tech companies make their AI efforts more sustainable? Share your thoughts below!