This past April, Facebook began operations at its second U.S. data center, in Forest City, N.C., joining its existing facility in Prineville, Ore. But how does the social network deal with the difference in climate between the two locations?

Facebook engineer Daniel Lee divulged some of the technical details in a post on the Open Compute Project blog:

Building a data center based on Open Compute Project designs in a relatively hot and humid area like Forest City, N.C., presented some interesting challenges. Chief among them, of course, was whether the 100 percent outdoor air-cooling system Facebook debuted in our Prineville, Ore., facility could operate as efficiently in an environment where the ASHRAE 50-year maximum wet bulb temperature is 21 percent higher: 84.5°F, versus Prineville's 70.3°F.
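For readers keeping score at home, wet bulb temperature folds humidity into the reading: the more moisture in the air, the closer it sits to the ordinary dry bulb figure. The short Python sketch below uses Stull's empirical approximation (a common textbook fit, not ASHRAE's tabulated data or anything Facebook has published) to show how the two relate; the sample inputs are purely illustrative.

```python
import math

def wet_bulb_c(dry_bulb_c, rel_humidity_pct):
    """Approximate wet bulb temperature (deg C) from dry bulb temperature and
    relative humidity, using Stull's (2011) empirical fit. Reasonable for
    RH of roughly 5-99 percent and temperatures between about -20 C and 50 C."""
    t, rh = dry_bulb_c, rel_humidity_pct
    return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t + rh)
            - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

def f_to_c(f):
    return (f - 32.0) * 5.0 / 9.0

def c_to_f(c):
    return c * 9.0 / 5.0 + 32.0

# Illustrative inputs only: a 95 F afternoon at 30 percent relative humidity
# has a wet bulb near 72 F, far below the dry bulb reading. That spread is
# what an evaporative ("misting") cooling system exploits.
print(round(c_to_f(wet_bulb_c(f_to_c(95.0), 30.0)), 1))
```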

The design we use in Prineville works great in central Oregon — a high desert plain with hot, dry summers and cool evenings and winters. These are ideal conditions for using evaporative cooling and humidification systems, instead of the mechanical chillers used in more conventional data center designs. And ASHRAE 50-year design weather maximums and bin weather data show that such a system is sufficient for even the hottest summer days in central Oregon.
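The reason dry climates are so friendly to this approach is that a direct evaporative cooler can push supply air toward, but never below, the wet bulb temperature. Here is a minimal sketch of that relationship, assuming a typical saturation effectiveness of about 85 percent; the effectiveness and the sample conditions are assumptions, not figures from Facebook.

```python
def evaporative_supply_temp_f(dry_bulb_f, wet_bulb_f, effectiveness=0.85):
    """Rough supply-air temperature out of a direct evaporative cooler.

    Such a cooler can move air toward, but never below, the wet bulb
    temperature; 'effectiveness' is the fraction of the dry bulb / wet bulb
    spread it actually captures (85 percent is an assumed, typical value).
    """
    return dry_bulb_f - effectiveness * (dry_bulb_f - wet_bulb_f)

# Hypothetical high-desert afternoon: hot but dry, so the spread is wide
# and evaporative cooling alone delivers cool supply air (about 69.5 F).
print(round(evaporative_supply_temp_f(dry_bulb_f=95.0, wet_bulb_f=65.0), 1))
# Hypothetical muggy afternoon: narrow spread, so evaporation buys little
# (supply air stays in the mid-80s F).
print(round(evaporative_supply_temp_f(dry_bulb_f=95.0, wet_bulb_f=84.0), 1))
```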

When we started looking at Forest City, the bin weather data suggested that refrigeration might not be required, but ASHRAE 50-year design weather maximums suggested otherwise. We ultimately decided to install a direct expansion (DX) coil system in the facility, just in case it might be needed, but it was important to us to find a way to make the free cooling system work — the potential efficiency gains to be found in keeping those DX units switched off were just too great to ignore.
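Conceptually, the decision Lee describes boils down to free cooling first, DX only as a backstop. The sketch below is a hypothetical illustration of that logic, not Facebook's actual control code; the supply-air target and the approach-to-wet-bulb margin are made-up values.

```python
def choose_cooling_mode(outdoor_wet_bulb_f, target_supply_f, approach_f=4.0):
    """Hypothetical free-cooling-first decision, not Facebook's control code.

    A direct evaporative stage cannot cool air below the outdoor wet bulb,
    and in practice lands a few degrees above it (the assumed 4 F approach).
    If that is cold enough to hit the target supply-air temperature, leave
    the DX coils switched off; otherwise bring them in as trim cooling.
    """
    best_achievable_f = outdoor_wet_bulb_f + approach_f
    if best_achievable_f <= target_supply_f:
        return "free cooling only (DX off)"
    return "free cooling plus DX trim"

print(choose_cooling_mode(outdoor_wet_bulb_f=72.0, target_supply_f=80.0))  # DX stays off
print(choose_cooling_mode(outdoor_wet_bulb_f=84.5, target_supply_f=80.0))  # DX kicks in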

Summer 2012 in the U.S. presented an excellent test of the design. July 2012 was the second-hottest month on record in North Carolina, in the third-hottest summer ever documented in the continental U.S. At one point, the dry bulb temperature topped 100°F outside the data center.

But despite the record-breaking dry bulb temperatures, we didn't run the DX coils at all this past summer. The trend data show that on the record hot days, relative humidity was low, allowing the misting system to provide all the cooling we needed.
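As a rough sanity check on that claim: in dry air, even a 100°F reading leaves the wet bulb far below the 84.5°F design maximum, so the misting system still has room to work. The humidity and wet bulb figures in this back-of-the-envelope snippet are assumptions, not measurements from the facility.

```python
# Back-of-the-envelope check with assumed numbers (not facility measurements):
# on a 100 F day, air at roughly 25 percent relative humidity has a wet bulb
# in the low-to-mid 70s F, comfortably under the 84.5 F design maximum.
DRY_BULB_F = 100.0            # the record reading mentioned above
ASSUMED_WET_BULB_F = 73.0     # hypothetical value consistent with dry air
DESIGN_MAX_WET_BULB_F = 84.5  # ASHRAE 50-year figure quoted earlier

headroom_f = DESIGN_MAX_WET_BULB_F - ASSUMED_WET_BULB_F  # margin below the design max
spread_f = DRY_BULB_F - ASSUMED_WET_BULB_F               # spread the misting system can exploit
print(f"wet bulb headroom: {headroom_f:.1f} F, evaporative spread: {spread_f:.1f} F")
```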

So it turns out this design can work in relatively high heat and humidity, and it can work as efficiently as it does at Prineville. In fact, the power usage effectiveness for Forest City clocked in at 1.07 this summer, versus Prineville's 1.09 during roughly the same period. It remains to be seen how efficiently the system would perform if both the dry bulb and wet bulb temperatures were to simultaneously exceed the limits we've set, but the likelihood of such an event in this region is low, and its impact on annualized PUE would likely be minimal.
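For context, power usage effectiveness is simply total facility energy divided by the energy that reaches the IT equipment, so a PUE of 1.07 means roughly 7 percent overhead for cooling, power distribution and everything else. The meter readings in this sketch are hypothetical, chosen only to reproduce the quoted figures.

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """Power usage effectiveness: total facility energy divided by the energy
    that actually reaches the IT equipment. A value of 1.0 would mean zero
    cooling and distribution overhead."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical meter readings chosen only to reproduce the quoted figures:
# about 70 kWh of overhead per 1,000 kWh of IT load at Forest City, versus
# roughly 90 kWh at Prineville.
print(round(pue(1070.0, 1000.0), 2))  # 1.07
print(round(pue(1090.0, 1000.0), 2))  # 1.09
```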

Readers: Are you surprised that Facebook was able to attain similar results at its two data centers in the U.S., despite the vastly different climates of their respective areas?