How Artificial Intelligence is Driving Up Electricity Demand

Artificial intelligence is reshaping the tech landscape and sharply increasing the power consumption of data centers. A University of California, Berkeley study projects that data centers could account for up to 12% of total U.S. electricity use by 2028.
The Remarkable Surge in Data Center Electricity Consumption
The energy requirements of U.S. data centers are rising rapidly. A recent UC Berkeley study highlights an “accelerating increase in electricity consumption by American data centers.”
How AI is Shaping Our Electricity Usage
This surge is largely due to advances in Artificial Intelligence (AI). Beginning around 2017, data centers increasingly deployed AI-driven GPU servers in their infrastructure. As a result, these centers' share of total annual U.S. electricity consumption rose from 1.9% in 2018 to 4.4% by 2023.
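As a rough illustration of how fast that share grew, the two figures above imply a compound annual growth rate of about 18% per year. This is a back-of-the-envelope sketch of the share of total consumption, not of absolute data-center energy use, since total U.S. consumption also changed over the period:

```python
# Implied compound annual growth rate (CAGR) of the data-center share
# of U.S. electricity consumption, from the figures quoted above.
start_share = 0.019  # 1.9% of U.S. electricity in 2018
end_share = 0.044    # 4.4% of U.S. electricity in 2023
years = 2023 - 2018

cagr = (end_share / start_share) ** (1 / years) - 1
print(f"Implied annual growth in share: {cagr:.1%}")  # ≈ 18.3%
```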
What to Expect in the Coming Years
Forecasts for 2028 predict that data centers could consume between 325 and 580 TWh of electricity, representing 6.7% to 12% of total U.S. electricity consumption. The rise of AI-specific servers may further accelerate this trend.
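A quick sanity check shows the two ends of that forecast are internally consistent: dividing each TWh figure by its corresponding share yields roughly the same implied total for U.S. electricity consumption in 2028. A minimal sketch, using only the numbers quoted above:

```python
# Each end of the 2028 forecast implies a total U.S. consumption figure
# (data-center TWh divided by data-center share); the ends should agree.
low_twh, low_share = 325, 0.067    # low end: 325 TWh at 6.7%
high_twh, high_share = 580, 0.12   # high end: 580 TWh at 12%

total_low = low_twh / low_share    # ≈ 4851 TWh implied total
total_high = high_twh / high_share # ≈ 4833 TWh implied total
print(f"Implied U.S. totals: {total_low:.0f} TWh vs {total_high:.0f} TWh")
```

Both ends imply a total around 4,800–4,900 TWh, so the forecast range reflects genuine uncertainty about data-center demand rather than disagreement about overall U.S. consumption.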
It’s important to note that data-center energy consumption stayed nearly flat through the early to mid-2010s, thanks largely to efficiency gains at hyperscale and colocation facilities. By 2023, these facilities accounted for nearly 80% of total server energy usage, and they are poised to play an increasingly vital role.
Future Predictions
Based on these projections, the future of AI may hinge on control of energy and computing infrastructure, a prospect that is already raising concerns. Will we have sufficient energy sources to power the data centers needed for increasingly energy-intensive AI training? Only time will tell.