
Photo/Lan Suying (NBD)

The AI revolution is sweeping across the globe at an unprecedented pace, driving technological progress and economic growth. Behind this revolution, however, a neglected challenge is quietly approaching: tight electricity supply.

"There won't be enough electricity to run all the chips next year," warned Elon Musk. According to The New Yorker, OpenAI's ChatGPT consumes as much electricity every day as 17,000 American households. This striking figure underscores AI's enormous demand for power.

In its latest research report, Merrill Lynch paints an even more urgent picture: AI electricity consumption in the US is expected to grow at a compound annual growth rate of 25%-33% between 2023 and 2028. The CEO of chip designer Arm has even stated that AI could consume a quarter of US electricity by 2030.

In the face of this challenge, the US power system is under unprecedented strain. An overhaul of the public grid is urgently needed, but progress is slow. To compete for limited grid access, companies have extended the race to regions not traditionally associated with the tech industry, such as Ohio and Iowa.

In this context, exploring new power supply solutions has become crucial. Clean energy technologies such as nuclear fusion, fuel cells, and geothermal energy are being seen by companies as potential safeguards for future power supply.

US Power Grid Under Pressure: ChatGPT Consumes Electricity of 17,000 Homes Per Day

"I think there won't be enough electricity to run all the chips next year." Elon Musk made this prediction in March this year, highlighting the risk of power shortages behind the AI boom.

Training and inference for large AI models require enormous amounts of electricity. According to The New Yorker, OpenAI's ChatGPT handles about 200 million requests per day and consumes more than 500,000 kilowatt-hours of electricity in the process, equivalent to the daily consumption of 17,000 American households. "The end of computing is electricity" may sound like a joke, but it carries real weight.
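A quick back-of-envelope check ties these figures together. The inputs below are the article's own numbers; the derived values (energy per request, implied per-household usage) are rough estimates, not reported figures:

```python
# Inputs from the article
daily_kwh = 500_000            # ChatGPT's reported daily consumption (kWh)
requests_per_day = 200_000_000 # reported daily request volume
households = 17_000            # reported household equivalence

# Derived estimates (not stated in the article)
energy_per_request_wh = daily_kwh * 1000 / requests_per_day
household_kwh_per_day = daily_kwh / households

print(f"Energy per request: {energy_per_request_wh:.2f} Wh")
print(f"Implied household usage: {household_kwh_per_day:.1f} kWh/day")
```

The implied ~29 kWh per household per day is broadly consistent with typical US residential averages, so the 17,000-household comparison holds up.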

As the leader of the AI industry, the United States has already felt the pressure AI demand puts on the grid. Tech entrepreneur and former Google engineer Kyle Corbitt recently revealed on social media that Microsoft engineers training GPT-6 are busy connecting GPUs located in different regions. It is a very difficult task, but they have no choice: concentrating 100,000 Nvidia H100 chips in one area would collapse the local grid.

At present, major US technology giants are actively deploying AI, building data centers and other infrastructure across the country, and grid access has become a key competitive resource. According to The Washington Post, places traditionally unconnected to the computing industry, such as Columbus, Ohio; Altoona, Iowa; and Fort Wayne, Indiana, have become battlegrounds where data center developers compete for land and grid access.

"Everyone is chasing electricity now, and they are willing to go anywhere," Andy Cvengros, managing director of data center markets at real estate services firm Cushman & Wakefield, told The Washington Post. According to Cvengros, power companies generally say that, faced with the sudden surge in electricity demand, they must first assess their own system capacity before they know whether they can cope.

Recently, data center developers Michael Halaburda and Arman Khalili are busy converting an abandoned tile factory in the Portland area into a data center. Just a few months ago, they thought electricity was not a problem, but recently the power company reminded them that they need to conduct "line and load studies" to assess whether it could supply the facility with 60 megawatts of electricity — roughly the amount needed to power 45,000 homes.
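The 60-megawatt figure can be sanity-checked against the 45,000-home comparison. Both numbers come from the article; the per-home daily usage computed below is an implied estimate:

```python
# Inputs from the article
facility_mw = 60   # requested power for the Portland facility
homes = 45_000     # reported household equivalence

# A 60 MW facility running continuously uses 60 MW * 24 h of energy per day.
kwh_per_home_per_day = facility_mw * 1000 * 24 / homes

print(f"Implied per-home usage: {kwh_per_home_per_day:.0f} kWh/day")
```

The implied 32 kWh/day per home is close to typical US residential consumption, which supports the comparison.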

According to data from the International Energy Agency, there were about 2,700 data centers in the United States in 2022, accounting for about 4% of the country's total electricity consumption. The IEA projects this share will reach 6% by 2026.

As AI's demand for chips grows and higher-performance chip clusters require ever more cooling, the AI industry's power demand will only increase. Rene Haas, CEO of chip design company Arm, recently expressed concern that unless chip efficiency improves, data centers could account for 20% to 25% of US electricity demand by 2030.

Merrill Lynch pointed out in its research report that the expansion of AI data centers is just one factor driving the growth of US electricity demand. Other drivers include industrial growth, the popularity of electric vehicles, and the electrification of buildings.

Merrill Lynch forecasts that US electricity demand will grow at a compound annual growth rate (CAGR) of 2.8% from 2023 to 2030, within which AI's electricity consumption will grow at a CAGR of 25% to 33% from 2023 to 2028. This will undoubtedly challenge the design and operation of data centers.
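To put the 25%-33% CAGR range in perspective, a compound growth rate over the five years from 2023 to 2028 can be turned into a total multiple (the CAGR bounds are the article's; the normalization to a 2023 baseline of 1.0 is an illustrative assumption):

```python
def project(cagr: float, years: int) -> float:
    """Total growth multiple after compounding at `cagr` for `years` years."""
    return (1 + cagr) ** years

# Merrill Lynch's range for AI electricity consumption, 2023-2028 (5 years)
for cagr in (0.25, 0.33):
    print(f"CAGR {cagr:.0%}: 2028 demand = {project(cagr, 5):.2f}x the 2023 level")
```

In other words, the forecast implies AI electricity consumption roughly tripling to quadrupling over five years.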

Facing Bottlenecks, US Companies Explore Innovative Ways to Secure Power

Amidst the surging demand for electricity, US power infrastructure is in dire need of expansion and upgrades to cope with concentrated loads such as AI training. Most of the US grid was built in the 1960s and 1970s, and the majority of transmission lines and transformers have been in operation for over 25 years, making them prone to blackouts caused by extreme weather. The Texas power outage during the 2021 winter storm serves as a stark reminder.

Against this backdrop, the US Federal Energy Regulatory Commission (FERC) will hold a special meeting on May 13 to discuss power transmission issues. The meeting is reportedly expected to formulate plans to accelerate long-distance transmission line projects and the connection of clean energy projects to the grid.

In reality, alleviating the power bottleneck is one of the Biden administration's priorities. However, building a national grid raises interstate issues over which the federal government has limited power, and it involves extensive land acquisition, environmental assessments, and cost negotiations between states; it is ultimately a long and complicated process. Data show that in 2013 the United States could still build 4,000 miles of transmission lines per year, whereas today that figure is less than 1,000 miles.

Faced with the reality that the public grid cannot be significantly improved in the short term, businesses have begun seeking their own ways to secure power supply.

Purchasing nuclear power is one of the main directions. In March 2024, Amazon Web Services (AWS) signed an agreement to acquire a data center next to a nuclear power plant in Pennsylvania, with the potential to receive up to 960 megawatts of dedicated power from the plant.

In addition, building small nuclear reactors for dedicated power supply is another option.

Microsoft posted a job listing in 2023 seeking a nuclear technology expert to evaluate and integrate small modular nuclear reactors and microreactors "to power data centers that support Microsoft Cloud and AI." Microsoft is also making strides in nuclear fusion: in May 2023 it signed a power purchase agreement with fusion startup Helion to buy power starting in 2028. Notably, OpenAI CEO Sam Altman is an early and important investor in the startup.

However, no small modular nuclear reactor is currently in operation in the United States, partly due to the lengthy independent review process of the federal Nuclear Regulatory Commission. Meanwhile, generating electricity from nuclear fusion remains an unrealized scientific goal.

The aforementioned data center developers Michael Halaburda and Arman Khalili turned to more mature technologies such as fuel cells and geothermal energy. Their Portland data center project will be powered primarily by natural gas-fired fuel cells, with grid electricity as a supplement. In another new project in southern Texas, they chose not to connect to the grid at all, instead drilling hundreds of meters underground to power the data center with geothermal energy.

Editor: Alexander