Data centers powering artificial intelligence could use more electricity than entire cities
The power needs of artificial intelligence and cloud computing are growing so large that individual data center campuses could soon use more electricity than some cities, and even entire U.S. states, according to companies developing the facilities.
Data centers' electricity consumption has exploded over the past 10 years alongside their increasingly critical role in the economy. The facilities house the servers that power the applications businesses and consumers rely on for daily tasks.
Now, with the advent of artificial intelligence, data centers are growing so large that finding enough power to drive them and enough suitable land to house them will become increasingly difficult, the developers say. The facilities could increasingly demand a gigawatt or more of power — one billion watts — or about twice the residential electricity consumption of the Pittsburgh area last year.
Technology companies are in a "race of a lifetime to global dominance" in artificial intelligence, said Ali Fenn, president of Lancium, a company that secures land and power for data centers in Texas. "It's frankly about national security and economic security," she said. "They're going to keep spending" because there's no more profitable place to deploy capital.
Renewable energy alone won't be sufficient to meet their power needs. Natural gas will have to play a role, developers say, which will slow progress toward meeting carbon dioxide emissions targets.
Regardless of where the power comes from, data centers are now at a scale where they have started "tapping out against the existing utility infrastructure," said Nat Sahlstrom, chief energy officer at Tract, a Denver-based company that secures land, infrastructure and power for data center development.