Criteria for planning

Can liquids cool data centers a thousand times more efficiently than air? Or is this just an exaggerated, sweeping marketing claim? In our estimation, the actual gain probably lies somewhere between 50 and 1,000 times.

Is that enough of a basis for deciding on a major change in the type of cooling? Data centers should consider several aspects when planning for the future:

  1. Processing power continues to increase, and with it the operating temperature in the racks. This is why heat dissipation management is becoming increasingly important. There is no doubt about that.
  2. Air cooling is established and widespread. It has proven its worth and need not be rejected across the board. The various concepts have undergone continuous development and improvement.
  3. The classic configuration with hot aisle, cold aisle, ducts, fans, and cooling systems is often no longer sufficient. It is becoming difficult to achieve a PUE value (Power Usage Effectiveness, the ratio of total facility energy to IT equipment energy) that meets climate targets.
  4. Liquid cooling is physically more efficient than air cooling. Due to their heat capacity and density, liquids can dissipate heat up to 1,000 times more efficiently; a rough comparison follows this list.
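
To put the figure from point 4 into perspective, here is a minimal back-of-the-envelope comparison in Python. It only considers volumetric heat capacity (density times specific heat) at typical room-temperature values; real installations fall well short of this theoretical ratio because of pumping power, flow limits, and heat-exchanger losses, which is why practical estimates end up in the 50 to 1,000 times range mentioned above.

```python
# Rough comparison: heat carried per cubic meter of coolant and per kelvin
# of temperature rise, using typical room-temperature values.

AIR_DENSITY = 1.2            # kg/m^3
AIR_SPECIFIC_HEAT = 1005     # J/(kg*K)
WATER_DENSITY = 998          # kg/m^3
WATER_SPECIFIC_HEAT = 4186   # J/(kg*K)

air_volumetric = AIR_DENSITY * AIR_SPECIFIC_HEAT        # ~1.2 kJ/(m^3*K)
water_volumetric = WATER_DENSITY * WATER_SPECIFIC_HEAT  # ~4.2 MJ/(m^3*K)

print(f"Air:   {air_volumetric / 1e3:.1f} kJ/(m^3*K)")
print(f"Water: {water_volumetric / 1e6:.1f} MJ/(m^3*K)")
print(f"Theoretical ratio: about {water_volumetric / air_volumetric:.0f}x")
```

In purely physical terms, the ratio comes out at roughly 3,500; real systems exploit only part of that.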

However, there are some additional factors to consider when planning future heat dissipation:

  • Power density per rack, i.e. the heat load that has to be dissipated
  • Computing workloads and packing density
  • Reconfiguration and patching needs
  • Infrastructure for liquid cooling in the computer room
  • Total cost of ownership (TCO)

General assessment

The TCO of liquid-cooled systems is likely to be significantly lower than that of air-cooled systems. However, the initial investment is likely to be higher.
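
The trade-off described above (higher initial investment, lower running costs over time) can be sketched as a simple cumulative cost comparison. The figures below are invented placeholders for illustration only, not R&M data; a real TCO calculation would have to use site-specific capital costs, energy prices, maintenance, and coolant handling.

```python
# Illustrative break-even sketch for liquid vs. air cooling.
# All cost figures are invented placeholders, not real data.

YEARS = 10
air_capex, air_opex = 100_000, 60_000        # placeholder capex and yearly opex
liquid_capex, liquid_opex = 180_000, 35_000  # placeholder capex and yearly opex

for year in range(1, YEARS + 1):
    air_total = air_capex + air_opex * year
    liquid_total = liquid_capex + liquid_opex * year
    if liquid_total < air_total:
        print(f"With these placeholder figures, liquid cooling breaks even in year {year}.")
        break
else:
    print("No break-even within the period considered.")
```

With these invented figures the crossover comes after a few years; the real break-even point depends entirely on local conditions.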

Heat exchangers with liquid cooling systems mounted on the racks are relatively easy to install. Chip cooling and immersion cooling of the devices, however, require specially designed equipment.

With a total load of less than 20 kW per rack, conventional air cooling will continue to be sufficient in the future. Above this threshold, R&M believes a liquid cooling strategy should be taken into consideration.
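
This rule of thumb can be made tangible with a simple airflow estimate, since the volume of air that has to be moved grows linearly with the rack load. The sketch below is our own illustration; the assumed temperature rise of 12 K across the rack and the example loads are assumptions, not values from the text.

```python
# Estimate the airflow needed to remove a given rack heat load:
# volume_flow = load / (density * specific_heat * delta_T)

AIR_DENSITY = 1.2         # kg/m^3
AIR_SPECIFIC_HEAT = 1005  # J/(kg*K)
DELTA_T = 12.0            # K, assumed air temperature rise across the rack

def required_airflow_m3_per_h(load_kw: float) -> float:
    """Volumetric airflow (m^3/h) needed to carry away load_kw of heat."""
    flow_m3_per_s = load_kw * 1000 / (AIR_DENSITY * AIR_SPECIFIC_HEAT * DELTA_T)
    return flow_m3_per_s * 3600

for load in (10, 20, 40):  # example rack loads in kW
    print(f"{load} kW per rack -> about {required_airflow_m3_per_h(load):,.0f} m^3/h of air")
```

At roughly 5,000 m³/h for a single 20 kW rack, fan power and aisle airflow management quickly become the limiting factors, which is why this threshold is a sensible point to start evaluating liquid cooling.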

 

In immersion cooling, individual servers or entire systems are immersed in a dielectric coolant.

 

Looking into the future

Further progress can be expected on the microelectronics side, where energy efficiency keeps improving.

Innovations such as co-packaged optics (CPO) aim to minimize the electrical transmission paths between chips and fibers and thus reduce transmission losses. At the same time, they can increase the number of transmission channels. Conventional pluggable connectors are no longer needed on the front panel, so device front panels can be designed more openly to allow better airflow despite a higher power density.

Greater performance and higher density in the racks therefore do not necessarily mean that the cost of heat dissipation will increase proportionally in the future.

 

In direct chip cooling, small channels take the coolant into the active electronic equipment.

 

Liquid cooling also needs air

Liquid cooling is a generic term covering several ways of dissipating heat. There are generally three main types: rear door heat exchangers, direct chip cooling, and immersion cooling.

  • In the case of heat exchangers on the rear of the racks (rear door heat exchangers), fans blow the warm air through a grid of pipes containing coolant. The liquid absorbs the heat, is pumped to an external cooling mechanism, and is then returned to the heat exchanger; a rough sizing example follows this list.
  • In direct chip cooling, small channels take the cooling fluid into the active electronic equipment. The channels and heat sinks are arranged above or below the heat-generating chips. This approach requires additional air circulation to dissipate the residual heat.
  • In immersion cooling, individual servers or entire systems are immersed in a dielectric coolant. The liquid circulates and transports the heat to external cooling systems. This is the most efficient form of cooling.
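
For the rear door heat exchangers described in the first bullet, the coolant flow needed on the liquid side can be estimated with the same kind of energy balance. The sketch below assumes a 30 kW rack and a 10 K coolant temperature rise; both numbers are illustrative assumptions, not figures from the text.

```python
# Coolant flow a rear door heat exchanger needs to absorb a rack load:
# mass_flow = load / (specific_heat * delta_T)

WATER_SPECIFIC_HEAT = 4186  # J/(kg*K)
WATER_DENSITY = 998         # kg/m^3

def coolant_flow_l_per_min(load_kw: float, delta_t: float = 10.0) -> float:
    """Water flow (liters per minute) needed to absorb load_kw at the given delta_t."""
    mass_flow_kg_per_s = load_kw * 1000 / (WATER_SPECIFIC_HEAT * delta_t)
    return mass_flow_kg_per_s / WATER_DENSITY * 1000 * 60

print(f"30 kW rack, 10 K rise: about {coolant_flow_l_per_min(30):.0f} L/min of water")
```

A few tens of liters of water per minute replace several thousand cubic meters of air per hour, which is why the liquid side of such systems remains comparatively compact.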

 

Classic DC cooling management is coming under pressure. Data center operators must find new ways to make «green data centers» a reality. Energy costs and legal regulations are also forcing them to act.

 

Infrastructure solutions from R&M

BladeCooling Plus from R&M Tecnosteel

With the BladeShelter family, R&M makes it possible to integrate cooling solutions seamlessly into the computer room infrastructure. The family includes the modular, scalable BladeShelter cooling unit. It can be used in closed and open architectures.

Get to know the BladeShelter family here.

Discuss the pros and cons of liquid cooling with R&M on LinkedIn. Share your experiences.

We recommend this independent technical article by Jeff Schuster, published by Mission Critical. It deals comprehensively with the question: Is liquid cooling right for your data center?