AI boom thrusts Europe between power-hungry data centers and environmental goals
The artificial intelligence boom is reshaping how data centers operate, as European developers come under pressure to lower the water temperatures of their energy-hungry facilities to accommodate the higher-powered chips of firms such as tech giant Nvidia.
AI is estimated to drive a 160% increase in demand for data centers by 2030, research from Goldman Sachs shows — an increase that could come at a cost to Europe's decarbonization goals, as the specialized chips used by AI firms are expected to hike the energy use of the data centers that deploy them.
High-powered chips — known as graphics processing units, or GPUs — are essential for training and deploying the large language models that underpin much of today's AI. These GPUs draw power at high density and give off more heat, which ultimately requires colder water to cool the chips reliably.
AI can consume 120 kilowatts of power in just one square meter of a data center, which is equivalent to the power consumption and heat dissipation of around 15 to 25 houses, according to Andrey Korolenko, chief product and infrastructure officer at Nebius, who referred specifically to the deployment of Nvidia's Blackwell GB200 chip.
"This is extremely dense, and from the cooling standpoint of view you need different solutions," he said.
Michael Winterson, chair of the European Data Center Association (EUDCA), warned that lowering water temperatures will eventually "fundamentally drive us back to an unsustainable situation that we were in 25 years ago."
"The problem we've got with the chipmakers is [that] AI is now a space race run by the American market where land rights, energy access and sustainability are relatively low on the pecking