Within six years, building the leading AI data center may cost $200B

Data centers to train and run AI may soon contain millions of chips, cost hundreds of billions of dollars, and require power equivalent to a large city’s electricity grid, if current trends hold.

That’s according to a new study from researchers at Georgetown, Epoch AI, and Rand, which looked at the growth trajectory of AI data centers around the world from 2019 to this year. The co-authors compiled and analyzed a data set of over 500 AI data center projects and found that, while the computational performance of data centers is more than doubling annually, so are the power requirements and capital expenditures.

The findings illustrate the challenge in building the necessary infrastructure to support the development of AI technologies in the coming decade.

OpenAI, which recently said that roughly 10% of the world’s population is using its ChatGPT platform, has a partnership with SoftBank and others to raise up to $500 billion to establish a network of AI data centers in the U.S. (and possibly elsewhere). Other tech giants, including Microsoft, Google, and AWS, have collectively pledged to spend hundreds of billions of dollars this year alone expanding their data center footprints.

According to the Georgetown, Epoch, and Rand study, the hardware costs for AI data centers like xAI’s Colossus, which has a price tag of around $7 billion, increased 1.9x each year between 2019 and 2025, while power needs climbed 2x annually over the same period. (Colossus draws an estimated 300 megawatts of power, as much as 250,000 households.)

[Chart: Epoch AI data center study. Image credits: Epoch AI]

The study also found that data centers have become much more energy efficient in the last five years, with one key metric — computational performance per watt — increasing 1.34x each year from 2019 to 2025. Yet these improvements won’t be enough to make up for growing power needs. By June 2030, the leading AI data center may have 2 million AI chips, cost $200 billion, and require 9 GW of power — roughly the output of 9 nuclear reactors.
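As a rough sanity check on the 2030 projection, the study’s annual multipliers can be compounded forward from Colossus’s 2025 figures. Treating Colossus’s ~$7 billion price tag and ~300 megawatts as the 2025 baseline is an assumption made here for illustration, not something the study states:

```python
# Back-of-envelope projection using the growth rates reported in the study.
# Baseline values (xAI's Colossus, per the article) are assumed, not from the study.
COST_GROWTH = 1.9        # hardware costs grew ~1.9x per year, 2019-2025
POWER_GROWTH = 2.0       # power needs grew ~2x per year over the same period
YEARS = 5                # mid-2025 to June 2030

cost_2025_billion = 7    # Colossus price tag, ~$7B
power_2025_gw = 0.3      # Colossus draw, ~300 MW

cost_2030 = cost_2025_billion * COST_GROWTH ** YEARS
power_2030 = power_2025_gw * POWER_GROWTH ** YEARS

print(f"Projected 2030 leading data center: ~${cost_2030:.0f}B, ~{power_2030:.1f} GW")
```

Compounding gives roughly $170 billion and 9.6 gigawatts, in the same ballpark as the study’s $200 billion and 9 GW headline figures.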

It’s not a new revelation that AI data center electricity demands are on pace to greatly strain the power grid. Data center power consumption is forecast to grow 20% by 2030, according to a recent Wells Fargo analysis. That could push renewable sources of power, which depend on variable weather, to their limits — spurring a ramp-up in non-renewable, environmentally damaging electricity sources like fossil fuels.

AI data centers also pose other environmental threats, such as high water consumption; they take up valuable real estate and erode state tax bases. A study by Good Jobs First, a Washington, D.C.-based nonprofit, estimates that at least 10 states each lose over $100 million per year in tax revenue to data centers, the result of overly generous incentives.

It’s possible, of course, that these projections won’t come to pass, or that the time scales are off. Some hyperscalers, like AWS and Microsoft, have pulled back on data center projects in the last several weeks. In a note to investors in mid-April, analysts at Cowen observed that there’s been a “cooling” in the data center market in early 2025, signaling the industry’s fear of unsustainable expansion.
