By: Clive Longbottom, Head of Research, Quocirca
Published: 18th March 2010
Copyright Quocirca © 2010
When presenting on how to build a modern data centre, I generally point to the fact that a data centre should be built for the machines, not for the humans, and an air temperature of around 26–27 degrees C is fine for most equipment—provided that the real temperature-dependent items are cooled through forced, ducted air cooling. If this is combined with hot aisle/cold aisle or contained rack systems, the warm air can generally be ducted away to use for space heating elsewhere in the building—so saving energy and cutting costs, while also boosting green credentials.
But this only works well where there is a need to heat spaces where humans work: fine for the UK during the cold spell we've been having; not so good for the Middle East, where space cooling is more of a requirement. However, even these places need hot water, and it is possible (if a little expensive) to use a heat pump to extract the low-grade heat from data centre exhaust air and upgrade it to the temperatures required for hot water.
There is, however, an alternative. Back in the good ol' days of the mainframe, water-cooled systems were used extensively, but they fell out of fashion as distributed systems became the norm and air cooling was seen as more cost-effective. But energy costs are getting more unpredictable in the short term, with the only certainty being that, in the longer term, they will go up. At the same time, equipment densities are creating more hot spots and leaving less air volume for effective air cooling to take place. The cost of designing, implementing and maintaining effective air cooling systems for today's data centres is getting beyond the reach of many organisations (and even vendors), and the possibility of water cooling is once again on the agenda.
In its research laboratories in Zurich, IBM has been investigating the best way of implementing water cooling for distributed systems in a modern data centre. It has come up with a nickel-coated copper block that sits on top of the CPU, replacing the standard CPU heat fins/fan ensemble. The copper block is micro-drilled to maximise the cooling capacity of the water. For blade-based and rack mounted systems, the feeder tubes to the blocks are routed to the back of the board, and a clever isolation system is used to ensure that the boards can be hot swapped without the need for the water system to be shut down and without any water leakage into the electrical systems.
The cleverest part concerns temperature deltas: cooling a CPU down to its best working temperature using air requires a massive difference between the CPU surface temperature and the air temperature; generally speaking, data centres use air at around 18C to ensure that CPUs don't heat above 75C. With water being a far better conductor of heat, the delta can be far smaller: IBM reckons that water at 60C will still maintain a CPU at 75C. Why use water at 60C? This is the temperature of the hot water systems in the majority of buildings. By using a closed circuit for the distilled water needed for cooling, the exit water at 65C or higher can be used to feed heat directly into the general hot water supply in the building, saving a much greater amount of overall energy and therefore cost.
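The arithmetic behind those deltas can be sketched with a simple steady-state model (coolant temperature plus power times thermal resistance). The wattage and thermal-resistance figures below are illustrative assumptions chosen to reproduce the article's 18C/60C examples, not IBM's published numbers.

```python
# Steady-state model: T_cpu = T_coolant + P * R_th
# P    = CPU power dissipation (watts) -- assumed value
# R_th = CPU-to-coolant thermal resistance (kelvin per watt) -- assumed values

def cpu_temperature(coolant_c, power_w, r_th):
    """CPU surface temperature for a given coolant temperature."""
    return coolant_c + power_w * r_th

POWER_W = 100.0     # assumed CPU dissipation
R_TH_AIR = 0.57     # assumed K/W for a heat-fin/fan air cooler
R_TH_WATER = 0.15   # assumed K/W for a micro-drilled water block

# Air at 18C needs a 57-degree delta to hold the CPU at about 75C;
# water at 60C needs only a 15-degree delta for the same result.
print(round(cpu_temperature(18, POWER_W, R_TH_AIR), 1))    # about 75.0
print(round(cpu_temperature(60, POWER_W, R_TH_WATER), 1))  # about 75.0
```

The point the numbers make is that water's much lower thermal resistance lets the coolant run hot enough to be useful elsewhere in the building.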
The problem would come if the water supply failed: there would be far less time to shut systems down before the CPUs came to harm. There is also the small matter of what happens if a leak does occur: not a problem with air-based systems, but rather more of one where water and electricity are concerned! However, neither of these issues should be a show-stopper. Self-contained circuits can ensure that water is available in all but the worst-case scenarios, automated systems can shut servers down very rapidly, and leaks can be avoided through the right choice of materials and well-engineered plumbing.
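As an illustration of the automated-shutdown idea, such a safeguard might amount to a simple watchdog loop. The sensor and shutdown functions here are hypothetical placeholders standing in for whatever monitoring interface a real installation would provide, and the thresholds are assumed values.

```python
import time

# Hypothetical coolant-failure watchdog: if coolant flow drops or CPU
# temperature breaches a threshold, trigger a shutdown before the CPU
# is harmed. read_flow, read_cpu_temp and shutdown are placeholder
# callables, not a real vendor API.

FLOW_MIN_LPM = 1.0   # assumed minimum acceptable coolant flow, litres/min
CPU_MAX_C = 85.0     # assumed emergency shutdown threshold

def watchdog(read_flow, read_cpu_temp, shutdown, poll_s=0.5):
    """Poll the sensors and shut down on the first breach."""
    while True:
        if read_flow() < FLOW_MIN_LPM or read_cpu_temp() > CPU_MAX_C:
            shutdown()
            return
        time.sleep(poll_s)
```

In practice the poll interval would need to be short relative to how quickly an uncooled CPU heats up, which is the "far less time" caveat noted above.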
Even for large remote data centres and service provider data centres where there will be little need for hot water in the building itself, water cooling can have a part to play. Most data centres will be within a commercial or industrial environment, and the hot water can be sold or just passed on to those who can make use of it, depending on whether the organisation wants to maximise its cost savings or to up its green credentials even further.
As the cost of power and the density of data centre equipment both continue to increase, it may be time to take the plunge back into water cooled systems, and pass the benefit on to the business through helping keep its hot water needs met.
Published by: IT Analysis Communications Ltd.
T: +44 (0)190 888 0760 | F: +44 (0)190 888 0761