The rapid advancement of artificial intelligence is driving a sharp rise in electricity demand. Research from Epoch AI indicates that by 2030, the most advanced AI supercomputers could require power equivalent to the output of nine nuclear power plants, roughly the electricity consumption of 7 to 9 million homes.
Today's most powerful supercomputers draw about 300MW, comparable to the electricity consumption of 250,000 homes; xAI's Colossus, for instance, cost an estimated $7 billion to build and runs on 200,000 chips. If supercomputer power needs continue to double annually, roughly 9GW will be required by 2030.
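The doubling projection can be sketched in a few lines. This is an illustration only: the baseline year (2025) and the exact 300MW starting point are assumptions drawn from the article's figures, and annual doubling is the article's stated trend, not a law.

```python
def projected_power_mw(baseline_mw: float, baseline_year: int, target_year: int) -> float:
    """Power requirement if demand doubles every year from the baseline."""
    return baseline_mw * 2 ** (target_year - baseline_year)

# Assumed baseline: ~300 MW in 2025, per the article's current figure.
mw_2030 = projected_power_mw(300, 2025, 2030)
print(f"{mw_2030 / 1000:.1f} GW")  # 9.6 GW, in line with the ~9 GW projection
```

Five doublings of 300MW give 9.6GW, consistent with the roughly 9GW figure cited above.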
OpenAI has announced the Stargate supercomputer project, with investments exceeding $100 billion, aimed at building key AI infrastructure. Nvidia has likewise committed to spending up to $500 billion over the next four years on AI infrastructure in the U.S. These investments underscore the shift of supercomputers from research tools to industry engines that generate economic value.
Despite improvements in energy efficiency, AI supercomputers are expected to keep pushing power demand higher. Companies such as Microsoft and Google are exploring nuclear energy as a potential source to meet these growing needs.