Nigel Gore, Global Offerings, High Density and Liquid Cooling at Vertiv
Data centres continued to flourish in 2020, driven by strong growth in hyperscale and colocation facilities. New applications in artificial intelligence (AI), machine learning, deep learning, connected devices and the Internet of Things (IoT) also increased demand for computing power.
However, as data centres grow, so do their carbon footprints. The EU Commission recently noted that data centres and telecoms are responsible for a significant environmental footprint, and “can and should become climate neutral by 2030”. To work toward meeting this goal, the industry needs to implement innovative cooling technologies.
Liquid cooling systems can remove heat up to 1,200 times more effectively than air. This is because liquid delivers cooling directly to the heat source, rather than indirectly via fans or convection. Liquid cooling also makes it easier to transport heat, which opens up the possibility of reusing this energy elsewhere. In Switzerland, for example, trials have used the heat removed by liquid cooling to warm local swimming pools.
Innovation in liquid cooling is advancing in line with the global drive for energy efficiency. For example, Google is using sea-water cooling for its new data centre in Finland. And PlusServer is building a data centre in Strasbourg that uses groundwater at a constant 10 to 14 degrees Celsius as the feed liquid for cooling air in the data centre. Microsoft has also recently declared its experiment in submerged data centres a success, after trialling the system off the Scottish coast. The pre-built unit contained 864 servers and 27.6 petabytes of storage, the first of potentially many sub-sea containers.
Liquid cooling heats up
According to Forbes, US data centres now use more than 90 billion kilowatt-hours of electricity a year, requiring roughly 34 giant (500-megawatt) coal-fired plants. Global data centres used roughly 416 terawatt-hours of electricity last year, or about 3% of the planet's total electricity demand. That is almost 40% more than the entire UK energy requirement.
We expect this data centre energy consumption to double every four years. This is because the new digital economy demands ever more computing power, which means that data centre applications now generate much more heat. Whereas traditional rack densities were less than 10 kW, artificial intelligence and high-performance computing applications require new server and GPU hardware infrastructure. Vendors such as NVIDIA, AMD and Intel are also developing more powerful, hotter-running chips, with the cumulative effect of higher rack densities. This necessitates cooling solutions capable of handling in excess of 60 kW per rack.
Data centres will struggle to remove the extra heat generated by these new demands using traditional air cooling systems. Air-cooled racks, relying on Computer Room Air Conditioning (CRAC) alone, are no longer up to the job for high-density workloads: the thermal and physical properties of air limit its ability to capture heat. Liquid cooling, or combinations of CRAC and liquid cooling, is necessary to cool data centres with next-generation high-density chip architectures and applications.
Liquids are an order of magnitude better at capturing heat. Water, for example, has a specific heat capacity of 4,179 J/(kg·K), making it far more efficient than air by volume. Liquid can transfer heat away from the most sensitive and critical components of a CPU or GPU. This is likely why Omdia’s Data Centre Thermal Management Report 2020 found that adoption of liquid cooling methods, such as immersion, will double between 2020 and 2024.
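The gap between water and air can be made concrete with a back-of-the-envelope comparison. The sketch below uses standard textbook values for specific heat capacity and density at roughly room temperature (not figures from this article) to estimate how much more heat water absorbs than air per unit volume per degree of temperature rise:

```python
# Rough comparison of volumetric heat capacity: water vs air.
# Values are common textbook figures near 25 °C, used for illustration only.
WATER_SPECIFIC_HEAT = 4179.0  # J/(kg*K)
WATER_DENSITY = 997.0         # kg/m^3
AIR_SPECIFIC_HEAT = 1005.0    # J/(kg*K)
AIR_DENSITY = 1.2             # kg/m^3

# Volumetric heat capacity = specific heat * density, in J/(m^3*K)
water_vol_heat = WATER_SPECIFIC_HEAT * WATER_DENSITY
air_vol_heat = AIR_SPECIFIC_HEAT * AIR_DENSITY

ratio = water_vol_heat / air_vol_heat
print(f"Water absorbs ~{ratio:,.0f}x more heat per unit volume per kelvin")
```

On these assumed values the ratio comes out in the low thousands, which is why even modest liquid flow rates can carry away heat loads that would require enormous volumes of moving air.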
Cold plate and Immersion Cooling
At Vertiv, we are experiencing demand for liquid cooling deployments across our global customer base. Liquid cooling can help data centre managers deliver more compute power while reducing the facility’s overall power consumption and improving its power usage effectiveness (PUE).
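PUE itself is a simple ratio: total facility power divided by the power delivered to IT equipment, so a value closer to 1.0 means less overhead spent on cooling and power distribution. The sketch below illustrates the metric with hypothetical numbers of my own choosing, not figures from Vertiv:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power.

    A PUE of 1.0 would mean every watt entering the facility reaches IT gear.
    """
    return total_facility_kw / it_equipment_kw

# Hypothetical facility with a 1,000 kW IT load:
# air-cooled, with 800 kW of cooling and distribution overhead...
print(pue(1800.0, 1000.0))  # 1.8
# ...versus the same load if liquid cooling cut overhead to 300 kW (illustrative).
print(pue(1300.0, 1000.0))  # 1.3
```

The IT load is identical in both cases; only the cooling overhead changes, which is exactly the lever liquid cooling pulls on.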
Cold plates are increasing in popularity for liquid cooling. They are mounted directly to the surface of high heat-generating components inside the server, and a closed loop pumps cooling fluid directly to the chip. Multiple cold plate designs exist: these include CPU-only cold plates, but extend to designs that capture heat from CPUs, GPUs and memory components within the IT devices.
An alternative method is immersion cooling. This requires a server to be completely submerged in a dielectric fluid, eliminating the need for air cooling entirely.
Liquid cooling has undoubted benefits, but both cold plate and immersion techniques present challenges for data centres. Both methods are a radical departure from traditional thermal management approaches, and many data centres have legacy infrastructure that can’t easily switch from air to liquid cooling systems.
Moving to liquid cooling
Converting to liquid cooling requires a full-scale exercise to map out the equipment with the highest heat output. This includes examining the networking, telecommunications, power delivery and data storage infrastructure in detail, as well as conducting a payback analysis and trade-off comparison. Other costs to factor in include labour overheads and the level of conversion required to switch facilities to a liquid-cooled environment.
But as sustainability becomes increasingly important, the choice is between rebuilding a data centre from scratch or retrofitting a liquid cooling option. An increasingly popular retrofit approach is to deploy liquid cooling in an air-cooled data centre, focusing on specific high-density racks. But each project comes with its own specific requirements, and consulting with experts at the outset prevents project overruns and unexpected costs.
A sustainable future
We need to be more innovative in the way we use and adopt liquid cooling solutions, especially as data centres come under renewed scrutiny over their energy needs. A recent initiative saw a 5G infrastructure business pump the water used for liquid cooling into a housing complex, heating the domestic water supply and radiators while cooling its servers.
We will also need collaboration between industry and academia to provide competitive and cost-effective solutions. For example, Vertiv works with the Center for Energy-Smart Electronic Systems (ES2) and its partner universities to further the efficiency of data centres. This type of partnership can help develop compatible materials, deepen our understanding of fluid hygiene, filtration and containment methods, and optimise controls for liquid cooling systems.
The need for data will continue to grow; we can’t put that genie back in the bottle. It’s how we manage consumer demand for data, and the energy it requires, that will shape the future of the data centre industry. The aspiration is that liquid cooling innovations will help reduce our carbon footprint and create a more sustainable future.