While companies building artificial intelligence (AI) supercomputers are likely to be able to get the chips and capital they need, they may run into power constraints by 2030, according to research institute Epoch AI.
“If the observed trends continue, the leading AI supercomputer in June 2030 will need 2 million AI chips, cost $200 billion and require 9GW of power,” the company said in a paper posted Wednesday (April 23). “Historical AI chip production growth and major capital commitments like the $500 billion Project Stargate suggest the first two requirements can likely be met. However, 9GW of power is the equivalent of nine nuclear reactors, a scale beyond any existing industrial facilities.”
To overcome this challenge, companies may shift to decentralized training approaches that spread a training run across AI supercomputers in several locations, according to the paper.
In the same paper, Epoch AI reported that since 2019, AI supercomputers’ computational performance has grown 2.5 times per year, while their power requirements and hardware costs have doubled each year.
The growth in computational performance has been driven by the use of more and better chips, the report said. Since 2019, chip quantity has increased 1.6 times per year, and performance per chip has likewise increased 1.6 times per year; the two factors compound to roughly the 2.5-times annual growth reported above.
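For readers who want to check the arithmetic, the sketch below compounds the growth rates reported by Epoch AI. Projecting the chip count forward to June 2030 uses the 200,000-chip Colossus system cited later in this article as a 2024 starting point, and the 5.5-year horizon is an assumption for illustration; it is not Epoch AI’s own calculation.

```python
# Rough sketch of the compounding arithmetic behind the reported trends.
# Growth rates are from the article; the 200,000-chip 2024 baseline (xAI's
# Colossus) and the 5.5-year horizon are illustrative assumptions.

chip_count_growth = 1.6   # chips per system, per year (reported)
chip_perf_growth = 1.6    # performance per chip, per year (reported)

# The two factors compound to roughly the reported 2.5x annual compute growth.
compute_growth = chip_count_growth * chip_perf_growth
print(f"Implied annual compute growth: {compute_growth:.2f}x")  # ~2.56x

# Extending the chip-count trend from late 2024 to mid-2030 (~5.5 years)
# lands in the same ballpark as the paper's 2-million-chip projection.
years = 5.5
chips_2030 = 200_000 * chip_count_growth ** years
print(f"Projected leading-system chip count in 2030: ~{chips_2030:,.0f}")
```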
“While systems with more than 10,000 chips were rare in 2019, several companies deployed AI supercomputers more than 10 times that size in 2024, such as xAI’s Colossus with 200,000 AI chips,” the paper said.
The paper also found that the share of AI supercomputers’ computing power owned by companies rather than the public sector rose from 40% in 2019 to 80% in 2025.
The company also found that 75% of AI supercomputers’ computing power is hosted in the United States, while the second-largest share, 15%, is hosted in China.
xAI launched its Colossus training cluster with 100,000 H100 chips in September, with owner Elon Musk saying at the time that Colossus would double in size to 200,000 chips, 50,000 of them H200s, within months.
Stargate, announced by President Donald Trump in January, aims to build large AI-focused data centers in the United States, starting with 10 in Texas, the first of which will be 500,000 square feet.