As temperatures soar across the UK, enterprises and commercial data centre operators are double-checking their cooling equipment, with many likely to deploy backup cooling.
Unconfirmed reports say some organisations have been forced to curtail services because of overheating servers.
The Guardian newspaper website reported: "There may be a pause in live blogging after 4pm, amusingly because we’re having to switch to backup servers because our main ones have overheated."
Data centres must maintain constant temperatures to stop IT equipment overheating.
This is usually set at around 22-23C. But as temperatures in North West London hit 34.5C, the hottest day since 2006, data centre operators must contend with both high outside temperatures and the massive amounts of heat generated by thousands of individual pieces of server, storage and network equipment.
The UK has more than 200 dedicated data centre buildings, ranging from small-footprint facilities to entire campuses in locations such as London Docklands, Slough, Crawley and Manchester.
These buildings are either wholly owned by large institutions such as banks or operated on a commercial basis by international players such as Digital Realty, Rackspace, Equinix and Interxion and indigenous companies like Virtus, Iomart, Redcentric and Pulsant.
Russell Poole, MD UK+I at Equinix, said: "Equinix always schedule preparatory work during spring every year to get us ready for the summer."
"In addition, the design standards of our buildings further ensure we maintain the highest level of performance even in extreme heatwave cases like the one experienced this week in the UK."
Tom Kingham, Director of Sales Engineering at Digital Realty, said: "All our data centres are designed to withstand the most extreme temperatures ever recorded in the region. Each data centre also has additional cooling capacity available to handle maintenance events and emergency situations."
New cooling approaches
Older data centres are typically kept cool by traditional air conditioning systems.
More recently, firms have opted for approaches such as indirect evaporative cooling from companies like Excool. In these systems the hot air inside the data centre is not vented outside or mixed with outside air but remains sealed within the system: it is moved from the data centre to outside units and cooled through heat exchangers. On hot days these heat exchangers are themselves cooled with fine mists of water.
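The benefit of this approach on a hot day can be roughly illustrated with the standard effectiveness model for evaporative coolers, in which the sealed data centre air is driven part of the way towards the outdoor wet-bulb temperature rather than the (much higher) dry-bulb temperature. This is a minimal sketch only; the function name and the effectiveness figure are illustrative assumptions, not figures from Excool or any other vendor:

```python
def indirect_evap_supply_temp(return_air_c, outdoor_wet_bulb_c, effectiveness=0.8):
    """Approximate supply-air temperature from an indirect evaporative cooler.

    Simplified standard model: the heat exchanger moves the sealed
    data centre air a fraction (the effectiveness, 0-1) of the way
    towards the outdoor wet-bulb temperature. All figures here are
    illustrative assumptions, not data from a specific product.
    """
    return return_air_c - effectiveness * (return_air_c - outdoor_wet_bulb_c)


# Illustrative scenario: even on a 34.5C dry-bulb day, if the outdoor
# wet-bulb temperature is around 20C, the mist-cooled heat exchangers
# can still deliver air near the typical 22-23C room setpoint.
supply = indirect_evap_supply_temp(return_air_c=35.0, outdoor_wet_bulb_c=20.0)
print(round(supply, 1))  # 23.0
```

The key point the sketch captures is that water misting makes performance track the wet-bulb rather than the dry-bulb temperature, which is why these systems hold up better in a dry heatwave than plain air-to-air exchangers.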
How well operators cope with a heatwave depends on the type of cooling equipment they use and the size of the workload they are handling.
Those using air conditioning and mechanical chillers must make their equipment work harder to blow more air through the servers. Another option is to let the temperature inside the data centre rise by a few degrees. While this saves on cooling costs, it pushes against the recommended operating envelope for the equipment. That is risky, because servers, especially older ones, can overheat.
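The recommended operating envelope referred to here is commonly taken from the ASHRAE thermal guidelines for data centre equipment, whose recommended range for standard kit is roughly 18-27C. A minimal sketch of the tradeoff, using that approximate range and a hypothetical helper function of my own naming:

```python
# Approximate ASHRAE recommended envelope for standard data centre
# equipment, in degrees Celsius (illustrative figures).
RECOMMENDED_LOW_C = 18.0
RECOMMENDED_HIGH_C = 27.0


def setpoint_status(temp_c):
    """Hypothetical helper: classify a room setpoint against the envelope."""
    if temp_c < RECOMMENDED_LOW_C:
        return "below recommended range"
    if temp_c > RECOMMENDED_HIGH_C:
        return "above recommended range: overheating risk"
    return "within recommended range"


# A typical 22-23C setpoint sits comfortably inside the envelope; letting
# the room drift a few degrees upward erodes that headroom quickly.
print(setpoint_status(23.0))  # within recommended range
print(setpoint_status(29.0))  # above recommended range: overheating risk
```

The sketch makes the article's point concrete: a few degrees of drift from a 23C setpoint still leaves headroom, but on a 34.5C day that margin can disappear faster than operators expect.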