Hurdles for Free Cooling

It is indeed a lot easier for Facebook, Google and Microsoft to operate data centers with "free cooling". After all, the servers inside those data centers are essentially "expendable": there is no need to make sure that any individual server does not fail, because the applications running on top of them can easily handle an occasional server failure. That is in sharp contrast with a data center that hosts the servers of hundreds of different customers, where the availability of even a small server cluster is of the utmost importance and regulated by an SLA (Service Level Agreement). The internet giants also have full control over both the facilities and the IT equipment.

There are other concerns, and humidity is one of the most important. Too much humidity and your equipment is threatened by condensation; too little, and electrostatic discharge can wreak havoc.

Still, the humidity of the outside air does not have to be a showstopper for free cooling, as many data centers can be outfitted with a water-side economizer. Cold water replaces the refrigerant, and pumps with a closed circuit replace the compressor. The hot return water passes through the outdoor heat exchangers; if the outdoor air is cold enough, the water-side system cools the water back down to the desired temperature.

Google's data center in Belgium uses water-side cooling so well that it does not need any additional cooling. (source: Google)
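In essence, the water-side economizer decision comes down to a simple comparison: the outdoor air has to be colder than the desired supply-water temperature by at least the approach temperature of the heat exchanger. The sketch below only illustrates that logic; the 18°C supply target and 3°C approach are assumed values for illustration, not figures from any particular facility.

```python
# Minimal sketch of a water-side economizer check (assumed, illustrative values).

DESIRED_SUPPLY_WATER_C = 18.0  # water temperature the room coolers need (assumed)
HX_APPROACH_C = 3.0            # heat exchanger approach temperature (assumed)

def economizer_alone_suffices(outdoor_drybulb_c: float) -> bool:
    """True if outdoor air can cool the return water back down to the
    desired supply temperature without running the chiller."""
    achievable_water_c = outdoor_drybulb_c + HX_APPROACH_C
    return achievable_water_c <= DESIRED_SUPPLY_WATER_C

print(economizer_alone_suffices(12.0))  # True: a 12°C day needs no chiller
print(economizer_alone_suffices(20.0))  # False: the chiller has to assist
```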

Most "free cooling" systems are really "assisting cooling systems": in many locations they cannot guarantee, all year round, the typical 20-25°C (68-77°F) inlet temperature that CRACs can offer.

All you need is ... a mild climate

But do we really need to guarantee a rather low 20-25°C inlet temperature for our IT equipment all year round? It is a very important question, as large parts of the world could rely on free cooling if the server inlet temperature did not need to be that low.

The Green Grid, a non-profit organization, uses data from the Weatherbank to calculate the amount of time that a data center can use air-side "free cooling" to keep the inlet temperature below 35°C. To make this more visual, they publish the data as color-coded maps. Dark blue means that air-side economizers can be effective for 8500 hours per year, which is basically year round. Here is the map of North America:

About 75% of North America can use free cooling if the maximum inlet temperature is raised to 35°C (95°F). In Europe, the situation is even better:

Although I have my doubts about the accuracy of the map (the south of Spain and Greece see a lot more hot days than the south of Ireland), it looks like 99% of Europe could make use of free cooling. So how do our current servers cope with an inlet temperature of up to 35°C?
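To make The Green Grid's approach more tangible, here is a minimal sketch of how such a free-cooling-hours figure could be estimated from hourly weather data. The fabricated seasonal temperature curve below stands in for a real 8,760-hour dataset, and humidity limits, which the real methodology also considers, are ignored.

```python
# Toy estimate of air-side free-cooling hours per year (fabricated stand-in data).
import math

# Stand-in for real hourly dry-bulb temperatures (8760 values for one year).
hourly_temps_c = [15 + 20 * math.sin(2 * math.pi * h / 8760) for h in range(8760)]

def free_cooling_hours(temps_c, inlet_limit_c):
    """Count the hours in which outside air alone keeps the server inlet
    at or below the given limit (humidity constraints ignored here)."""
    return sum(1 for t in temps_c if t <= inlet_limit_c)

print(free_cooling_hours(hourly_temps_c, 25))  # classic 25°C inlet limit
print(free_cooling_hours(hourly_temps_c, 35))  # relaxed 35°C inlet limit
```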

Comments

  • lwatcdr - Thursday, February 20, 2014

    Here in south Florida it would probably be cheaper. The water table is very high and many wells are only 35 feet deep.
  • rrinker - Tuesday, February 11, 2014

    It's been done already. I know I've seen it in an article on new data centers in one industry publication or another.
    A museum near me recently drilled dozens of wells under their parking lot for geothermal cooling of the building. Being large with lots of glass area, it got unbearably hot during the summer months. Now, while it isn't as cool as you might set your home air conditioning, it is quite comfortable even on the hottest days, and the only energy is for the water pumps and fans. Plus it's better for the exhibits, reducing the yearly variation in temperature and humidity. Definitely a feasible approach for a data center.
  • noeldillabough - Tuesday, February 11, 2014

    I was actually talking about this today; the big cost for our data centers is air conditioning. What if we had a building up north (arctic) where the ground is always frozen, even in summer? Geothermal cooling for free, by pumping water through your "radiator".

    Not sure about the environmental impact this would have, but the emptiness that is the arctic might like a few data centers!
  • superflex - Wednesday, February 12, 2014

    The enviroweenies would scream about you defrosting the permafrost.
    Some slug or bacteria might become endangered.
  • evonitzer - Sunday, February 23, 2014

    Unfortunately, the cold areas are also devoid of people and therefore internet connections. You'll have to figure in the cost of running fiber to your remote location, as well as how the distance might affect latency. If you go into permafrost areas, there are additional complications, as constructing on permafrost is a challenge. A data center high in the mountains but close to population centers would seem a good compromise.
  • fluxtatic - Wednesday, February 12, 2014

    I proposed this at work, but management stopped listening somewhere between me saying we'd need to put a trench through the warehouse floor to outside the building, and that I'd need a large, deep hole dug right next to the building, where I would bury several hundred feet of copper pipe.

    I also considered using the river that's 20' from the office, but I'm not sure the city would like me pumping warm water into their river.
  • Varno - Tuesday, February 11, 2014

    You seem to be reporting on the junction temperature which is reported by most measurement programs rather than the cast temperature that is impossible to measure directly without interfering with the results. How have you accounted for this in your testing?
  • JohanAnandtech - Tuesday, February 11, 2014

    Do you mean case temperature? We did measure the outlet temperature, but it was significantly lower than the junction temperature. For the Xeon 2697 v2, it was 39-40°C at a 35°C inlet, 45°C at a 40°C inlet.
  • Kristian Vättö - Tuesday, February 11, 2014

    Google's usage of raw seawater for cooling of their data center in Hamina, Finland is pretty cool IMO. Given that the specific heat capacity of water is much higher than air's, it is more efficient for cooling, especially in our climate where seawater is always relatively cold. (A rough back-of-the-envelope comparison follows after the comments.)
  • JohanAnandtech - Tuesday, February 11, 2014

    I admit, I somewhat ignored the Scandinavian datacenters as "free cooling" is a bit obvious there. :-)

    I thought some readers would be surprised to find out that even in sunny California free cooling is available most of the year.
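As a quick aside on the seawater comment above: the difference in volumetric heat capacity is what makes water such an effective coolant. The numbers below are standard textbook properties, and the script is only a back-of-the-envelope illustration of that claim.

```python
# Rough comparison: heat absorbed by 1 m^3 of coolant per kelvin of temperature rise.

WATER_CP = 4186.0   # J/(kg*K), specific heat of liquid water
WATER_RHO = 1000.0  # kg/m^3
AIR_CP = 1005.0     # J/(kg*K), specific heat of air
AIR_RHO = 1.2       # kg/m^3, air near room temperature

water_per_m3 = WATER_CP * WATER_RHO  # ~4.2 MJ per m^3 per K
air_per_m3 = AIR_CP * AIR_RHO        # ~1.2 kJ per m^3 per K

print(f"Water absorbs roughly {water_per_m3 / air_per_m3:.0f}x more heat per m^3 per K than air")
# -> on the order of 3500x, which is why a modest seawater flow can carry a large heat load
```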
