Cooling Data Centres for an AI World

Data centres face a growing challenge: cooling increasingly dense IT loads efficiently, sustainably and at scale. Few predicted today’s reliance on immersion and chip-level cooling, and the next leap forward will be driven by AI, edge computing and rising IT power densities. What are the five trends shaping an energy-conscious world?

As our lives become increasingly digital, data centres are the backbone of everything from video streaming to cloud computing to artificial intelligence. But this backbone is getting hot – literally. The rise in IT loads, driven by AI, high-definition media and real-time data processing, is creating unprecedented demand for efficient cooling. Meeting this demand requires more than just new hardware – it calls for innovation in design, integration and sustainability. At Deerns, we are addressing this head-on, designing hybrid and scalable cooling systems that work across new builds and legacy facilities alike.

A New Era: The Rise of High-Density Computing

Traditional data centres were built for IT racks drawing around 7-10 kW. Today, we regularly encounter installations pushing 50 kW, even 150 kW in extreme cases.

" The challenge? Managing immense heat generation in the same or smaller physical spaces.
Christopher Leahy Project Director

Simply blasting more chilled air no longer works. Air cooling struggles with uniformity and efficiency in tightly packed, high-load environments.

This evolution has pushed the industry into a transitional phase where hybrid systems – combining traditional air cooling with advanced liquid methods – are becoming the norm.

Inside Liquid Cooling: Technology That Works Smarter

Liquid cooling is more than a trend – it’s a requirement for the data centres of tomorrow. At Deerns, we’ve explored and deployed a range of these systems:

  • Immersion Cooling: Servers are submerged in a dielectric fluid that absorbs heat. The fluid is continuously circulated, cooled and returned to the tank. Operating temperatures around 35-36°C make this method extremely energy-efficient, reducing the need for traditional chillers.
  • Direct-to-Chip Cooling: Coolant flows through plate heat exchangers in direct contact with heat-intensive chips. This removes heat at the source, maintaining performance without overtaxing the entire HVAC system.
" What makes these systems appealing is not just efficiency but adaptability. Both technologies can often be retrofitted into existing installations with minimal disruption, particularly when designed with foresight.
Christopher Leahy Project Director

Retrofitting and Integration: Turning Challenges into Solutions

Legacy data centres come with their own set of challenges – from strict service level agreements (SLAs) that limit temperature ranges to infrastructure that appears incompatible with modern systems. Yet, with smart engineering and thoughtful design, these gaps can be effectively bridged.

Deerns specialises in integrating cutting-edge cooling technologies into both new builds and legacy environments by:

  • Leveraging existing chilled water loops to power liquid cooling systems.
  • Using return water from air-cooled systems (typically ~32°C) to supply liquid cooling loops that operate efficiently at ~36°C.
  • Implementing advanced Building Management Systems (BMS) for granular power monitoring, from busbars to fans to pipework.
  • Ensuring compliance with local building codes, regardless of where the equipment was manufactured.

Our teams often navigate complex international codes and multi-supplier ecosystems, ensuring seamless system compatibility.
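
As a rough illustration of the water-reuse idea above, the Python sketch below checks whether return water from an air-cooled loop is cool enough to feed a liquid cooling circuit and estimates how much heat that circuit could absorb. The flow rate and temperature rise are illustrative assumptions, not figures from a specific project.

```python
# Minimal sketch (illustrative values only): can ~32 degC return water from an
# air-cooled loop feed a direct-to-chip liquid cooling circuit designed to run
# at ~36 degC, and how much heat would that circuit pick up at a given flow rate?

CP_WATER = 4.186  # specific heat of water, kJ/(kg*K)

def can_cascade(air_loop_return_c: float, liquid_loop_supply_c: float) -> bool:
    """The air-loop return can supply the liquid loop only if it arrives no
    warmer than the temperature the liquid loop is designed to receive."""
    return air_loop_return_c <= liquid_loop_supply_c

def heat_absorbed_kw(flow_kg_s: float, supply_c: float, return_c: float) -> float:
    """Heat picked up by the coolant: Q = m_dot * cp * (T_return - T_supply), in kW."""
    return flow_kg_s * CP_WATER * (return_c - supply_c)

# Figures from the article: ~32 degC air-loop return feeding a loop that runs
# around 36 degC. The 3 kg/s flow and 8 K temperature rise are assumptions.
if can_cascade(32.0, 36.0):
    print(f"Cascade feasible: about {heat_absorbed_kw(3.0, 32.0, 40.0):.0f} kW "
          f"absorbed at 3 kg/s with an 8 K rise")
```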

Regulatory Pressures: The PUE Challenge

Power Usage Effectiveness (PUE) is now a cornerstone metric for data centres. It is the ratio of a facility’s total energy consumption to the energy used by the IT equipment itself, so a lower PUE indicates better energy efficiency.

In Europe, regulatory directives are evolving quickly. Germany is leading with energy usage limits, and wider EU regulations will soon follow. While many centres previously operated at a PUE of 1.5 or higher, the current target is 1.3 or lower. Some industry leaders aim for 1.2 or better.
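
To make these targets concrete, the short sketch below uses assumed load figures to compute PUE as total facility power divided by IT load, showing how much overhead energy separates a PUE of 1.5 from the 1.3 and 1.2 goals.

```python
# Minimal sketch with assumed figures: PUE = total facility power / IT equipment power,
# so a lower value means less overhead (cooling, distribution, lighting) per unit of IT load.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    return total_facility_kw / it_load_kw

it_load_kw = 1000.0  # assumed IT load

print(pue(1500.0, it_load_kw))  # 1.5 -> 500 kW of overhead, where many centres used to operate
print(pue(1300.0, it_load_kw))  # 1.3 -> 300 kW of overhead, the current target
print(pue(1200.0, it_load_kw))  # 1.2 -> 200 kW of overhead, what industry leaders aim for
```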

Deerns helps clients meet these goals not just through efficient cooling, but by applying holistic energy strategies that include real-time power monitoring, predictive energy modelling and system-level design aimed at reducing parasitic loads. We also explore alternative energy sources, such as Hydrotreated Vegetable Oil (HVO) fuel, as cleaner replacements for traditional diesel backup generators.

We are currently collaborating with a pioneering equipment supplier in Barcelona on test cases, helping refine immersion technology through real-world implementation. This hands-on experience keeps us, and our clients, ahead of the curve.

Looking Forward: What the Next 5-10 Years Hold

Forecasting the future of data centre cooling is tricky. Just five years ago, few predicted today’s reliance on immersion and chip-level cooling. What we do know is that the next leap forward will be driven by AI, edge computing and rising IT power densities.

Here are five trends we expect to shape the next decade:

  • AI Acceleration: AI models require massive processing power, pushing cooling demands even higher. Cooling systems must not just keep up – they must anticipate growth.
  • Server-Cooling Co-Design: Cooling will no longer be an afterthought. Manufacturers will need to design servers and chips in tandem with cooling systems.
  • Cleaner Backup Power: Diesel is on its way out. HVO and even modular nuclear generators may play a role in future energy strategies.
  • Waste Heat Reuse: Data centres will increasingly capture and reuse waste heat. At Deerns, we are already designing systems in Milan to supply this heat to district heating networks.
  • Embodied Carbon Tracking: We’re developing consulting frameworks that track carbon impact from material sourcing through to long-term operation.

In short, the next decade will demand not just smarter cooling, but smarter thinking – where innovation, sustainability and performance are engineered as one.

Let’s talk

Colin Wyatt

Sector Director, Data Centres