Implementing a Data Centre Liquid Cooling System (in 10 steps or less)

DCX The Liquid Cooling Company
7 min read · Dec 18, 2018


Data Centre Direct Liquid Cooling systems are no longer a glimpse into the future; they are now a reality. This article presents the steps required to make the change from traditional HVAC systems. It was first published in the November 2018 issue of Digitalisation World magazine (https://digitalisationworld.com/). We will cover each point in more detail later; for now, this is just a five-minute read.

There are many articles already praising liquid cooling technology, yet not a single one actually provides informed advice on how to implement it or why data centre operators should start working on direct liquid cooling adoption now.

Implementing liquid cooling is a cost-effective and simple answer to energy costs, climate change and regulatory challenges. In addition, most liquid cooling systems can be implemented without disrupting operations. We will advise on how to choose an architecture that supports your specific heat rejection scenario, what is required, and how to conduct a smooth transition from air cooling to liquid cooling. Let's find out what must be done.

1. Change your mindset

None of us likes change. We like to keep things the same, we accept industry standards and preferences without a second thought, and we avoid unnecessary risks. Moving on from air cooling takes effort. Liquid cooling has evolved since it was introduced by IBM in the late sixties. Current DLC / ILC vendors started around 2005 and have since produced thousands of cooling components. We can assume that liquid cooling systems are now both proven and mature. The biggest risk related to liquid cooling is that you might not evaluate it at all.

2. Check the facts and standards

The Uptime Institute 2018 Data Centre Survey showed that 14% of data centres have already implemented liquid cooling solutions. There will be a data centre near you that has implemented direct liquid cooling or is running a Proof of Concept implementation. In our experience, most cloud providers like to keep their liquid cooling systems secret as a competitive advantage.

In terms of standards, in 2011 ASHRAE introduced Thermal Guidelines for Liquid-Cooled Data-Processing Environments, followed by Liquid Cooling Guidelines for Datacom Equipment Centres, 2nd Edition, in 2014. Thermal Guidelines for Data Processing Environments, 3rd Edition, added insight into other considerations for liquid cooling, covering condensation, operation, water-flow rate, pressure, velocity and quality, as well as information on interface connections and infrastructure heat-rejection devices. More recently, LCL.GOV and OCP have started a liquid cooling standardisation effort for wider adoption. The 2011–2014 standards have evolved, and there are now a couple of vendors that deliver a full portfolio of DLC / ILC systems.

3. Do your own research

The problem is that not many data centre infrastructure integrators know anything about liquid cooling. You will also not hear about liquid cooling from data centre HVAC people. It simply hurts their business.

You have to do your own research and look actively for the available solutions. If you want to consider more expensive proprietary systems, Dell, Lenovo, Fujitsu and some other players like Huawei already have direct-to-chip liquid cooling solutions (and Fujitsu also has an immersion system). These are tied to specific server models, mostly HPC platforms, which is obviously a constraint. However, there are already many direct-to-chip and immersion liquid cooling vendors with complementary solutions that can be applied to your servers or can accommodate most of the servers on the market.

4. Evaluate hot-coolant, direct liquid cooling systems only

There are established far-from-heat-source liquid cooling systems (CRAH, Overhead, InRow™, Enclosed Cabinet, Rear Door Heat Exchanger), but when the heat is transferred directly from the source, the facility supply liquid temperature may be “warm water” (ASHRAE Class W4: 35°C to 45°C) or even “hot water” (Class W5: above 45°C). With a Delta T of 10°C, the outlet temperature may reach over 55°C, which facilitates heat re-use for building or community heating.
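As a quick sanity check of those temperatures, here is a minimal sketch; the simple "supply plus Delta T" model is an assumption for illustration, and the class bounds simply restate the ASHRAE figures quoted above.

```python
# Back-of-the-envelope check of the supply/return temperatures quoted above.
W4_SUPPLY_RANGE_C = (35.0, 45.0)   # ASHRAE Class W4 "warm water" facility supply
DELTA_T_C = 10.0                   # design Delta T used in the article

def return_temperature(supply_c: float, delta_t_c: float = DELTA_T_C) -> float:
    """Water temperature leaving the IT equipment for a given supply and Delta T."""
    return supply_c + delta_t_c

if __name__ == "__main__":
    for supply in W4_SUPPLY_RANGE_C:  # bottom and top of the W4 band
        print(f"supply {supply:.0f} degC -> return {return_temperature(supply):.0f} degC")
    # At the top of the W4 band the return already reaches 55 degC; any W5
    # ("hot water") supply above 45 degC pushes it higher, which is the regime
    # where building or community heat re-use becomes practical.
```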

Don't confuse chilled water cooling or indirect LC with hot-fluid direct liquid cooling. The goal is to move away completely from mechanical cooling and rely on a liquid plus free cooling combination for the whole year. We want to give up chilled water and extract heat directly from the components, not from the air inside the rack or the data centre space.

5. Assess and measure the benefits

The benefits of liquid cooling are widely known. But what do they mean in reality? Let me provide a few examples:

  • Increase in rack power density (from 20 kW to 100 kW+)
  • Lower data centre footprint, fewer server racks and interconnects
  • 30–50% less energy use and cost
  • 10–20% increase in computing power of liquid-cooled processors and GPUs
  • Increased reliability of equipment
  • Higher power density of processors
  • Fewer pieces of critical equipment in the data hall space
  • Simplified electrical and mechanical topology
  • Faster go-to-market: reduced site and structural construction compared to a traditional build
  • Reduction or elimination of fan vibrations
  • Lower CAPEX and just a fraction of traditional data centre OPEX
  • Decreased Total Cost of Ownership (TCO)
  • Reuse of otherwise wasted heat

What does that mean in practice? A smaller data centre, less complexity and a simplified mechanical topology, which results in more reliability. From a building point of view, the reduced site and structural construction, compared to a traditional build of equal computing power, means roughly a 30% reduction in the construction schedule. Simply put, go-to-market is significantly faster because there is less long-lead-time equipment and there are fewer pieces of critical equipment. The opportunity for heat recovery may be a benefit too, but the most visible result is a significant decrease in energy use, maintenance and construction cost.
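To show where a "30–50% less energy" figure can come from, here is a minimal sketch comparing annual facility energy for the same compute. The PUE values (1.6 air-cooled, 1.1 liquid-cooled) and the 10% share of IT power spent on server fans are illustrative assumptions, not measurements from this article.

```python
# Illustrative annual facility energy comparison for a 1 MW IT load.
HOURS_PER_YEAR = 8760

def annual_energy_mwh(it_load_kw: float, pue: float, fan_fraction: float = 0.0) -> float:
    """Facility energy over a year; fan_fraction is the share of IT power spent
    on server fans that direct liquid cooling removes."""
    effective_it_kw = it_load_kw * (1.0 - fan_fraction)
    return effective_it_kw * pue * HOURS_PER_YEAR / 1000.0

if __name__ == "__main__":
    air = annual_energy_mwh(1000.0, pue=1.6)                     # air-cooled hall
    dlc = annual_energy_mwh(1000.0, pue=1.1, fan_fraction=0.10)  # DLC hall, fans removed
    print(f"air: {air:.0f} MWh/yr, DLC: {dlc:.0f} MWh/yr, "
          f"saving: {100 * (1 - dlc / air):.0f}%")               # ~38% with these inputs
```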

6. Know the required Direct Chip Cooling (DLC) components

Sometimes less is more, and that is the case with direct-to-chip liquid cooling. Everything becomes simpler. Direct chip liquid cooling (DLC) requires just three kinds of components: server modules, LDUs and CDUs. Simple as that. The server components are processor modules, such as this DCX Intel Processor Module, which extract the heat at the source. They are usually installed within any standard rack and fit nicely into any server model, as processor sockets are standardised. This allows existing data centre infrastructure to be retrofitted without a major overhaul. What is additionally needed are LDUs (Liquid Distribution Units, sometimes called "manifolds"), such as this LDU, and CDUs (Coolant Distribution Units), for example this leak-proof VCDU with negative-pressure circulation, or the data centre proven CDU system made by Nortek.
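To make the sizing of such a loop concrete, here is a minimal sketch based on the textbook relation Q = ṁ · cp · ΔT. The 100 kW rack load, the 10°C Delta T and the water properties are illustrative assumptions, not DCX or Nortek specifications.

```python
# Rough coolant flow sizing for a direct-to-chip loop (textbook thermodynamics).
CP_WATER = 4186.0    # J/(kg*K), specific heat of water
RHO_WATER = 988.0    # kg/m^3, water density at roughly 50 degC

def required_flow_lpm(heat_load_kw: float, delta_t_c: float = 10.0) -> float:
    """Litres per minute of water needed to carry heat_load_kw at the given Delta T."""
    mass_flow_kg_s = (heat_load_kw * 1000.0) / (CP_WATER * delta_t_c)
    return mass_flow_kg_s / RHO_WATER * 1000.0 * 60.0

if __name__ == "__main__":
    # Example: a 100 kW rack whose heat is captured by cold plates
    print(f"{required_flow_lpm(100.0):.0f} L/min per 100 kW rack at Delta T = 10 degC")
    # ~145 L/min: the kind of flow an LDU/CDU pair has to distribute and reject.
```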

7. Know the required Immersion Liquid Cooling (ILC) components

Immersion liquid cooling completely changes the current data centre infrastructure. Existing standards are not sacrosanct, and most people don't even know why 19" racks are the de facto standard, why servers look like pizza boxes, what the purpose of rack doors is, why raised floors were used, and finally why 90% of racks are pitch black. We just accept these without asking questions. The Open Compute Project industry group questions everything, and we can expect some changes soon. Immersion cooling is coming to data centres. Many experts believe it will be the ultimate liquid cooling system of the future and that DLC is just a step ahead in the energy-efficiency competition. We are also convinced that full immersion cooling with dielectric fluids is the future of thermal management, not only for ICT components but also for batteries and electrical components. There are quite interesting designs already, such as this data centre scale immersion system.

8. Choose the technology and supplier

Looking into the two most recognised direct liquid cooling types, you may find at least 15 vendors to choose from. We have no problem referring to other vendors, as we believe that product features and quality should speak for themselves:

· Direct Chip Cooling: Aquila, Asetek, Chilldyne, CoolIT Systems, DCX, Iceotope. From recognised hardware vendors: Dell, Lenovo, Fujitsu, Huawei.

· Immersion liquid cooling: DCX, Green Revolution Cooling, LiquidCool Solutions, Midas Green Technologies, Submer and one hardware supplier: Fujitsu.

My personal advice: look for an open system, not a custom "boutique" solution that requires special servers to operate. Look for flexibility: you should be able to operate the same DLC loop with different types of servers.

The best way to start is simply to ask these vendors about their offerings, features, competitive advantages, experience, available options and prices. Simple as that.

9. Assess the risks and shortcomings — keep it real

There are many myths about liquid cooling, usually written by technology journalists with no hands-on experience of the technology. Let's keep it real about the risks and limitations liquid cooling may have and how to mitigate them. It is not hard to avoid critical issues. Octave Klaba, CEO of OVH (with over 350,000 liquid-cooled servers in operation), confirms 3 to 4 leaks per year, impacting a few servers each time.
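To put that figure in perspective, here is a quick calculation; treating "a few servers" as three per leak is our assumption, not OVH's statement.

```python
# Rough annual leak exposure implied by the OVH figures quoted above.
LEAKS_PER_YEAR = 3.5        # midpoint of "3 to 4 leaks per year"
SERVERS_PER_LEAK = 3        # assumed meaning of "a few servers each time"
FLEET_SIZE = 350_000        # liquid-cooled servers in operation

impacted_per_year = LEAKS_PER_YEAR * SERVERS_PER_LEAK
print(f"~{impacted_per_year:.0f} servers impacted per year, "
      f"about {100 * impacted_per_year / FLEET_SIZE:.4f}% of the fleet")
```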

10. Finally, the secret of getting ahead is getting started

If the sustainability of the solution or rising power costs are not incentive enough, do it for the money savings! Evaluate the solution: contact the vendors, check the available systems and do a PoC implementation. You don't want to be left behind. As the quote often attributed to Charles Darwin goes:

“It is not the strongest of the species that survives, nor the most intelligent. It is the one that is most adaptable to change.”

And the change is coming now.

P.S. Consider this OVH cube DC, located near the centre of Roubaix: 6 levels with 96 racks on each level, 576 racks in total. Depending on infrastructure density, that is 60 to 90 servers per rack, which gives roughly 35 to 52 thousand servers on a 25 x 25 metre footprint. This is why you'll be cooling your servers with liquid. Or you'll be closing your business.
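A quick check of those numbers; the 60 to 90 servers per rack range comes straight from the text above.

```python
# Arithmetic behind the OVH cube figures quoted above.
levels, racks_per_level = 6, 96
racks = levels * racks_per_level      # 576 racks in total
low, high = racks * 60, racks * 90    # 34,560 to 51,840 servers
footprint_m2 = 25 * 25                # 625 m^2
print(f"{racks} racks, {low:,} to {high:,} servers, "
      f"up to {high / footprint_m2:.0f} servers per m^2 of footprint")
```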
