Stulz: data cooling – Derek Siveter

The recent IT crash at RBS showed what can happen when banking systems fail. Millions of customers were left without access to their current accounts, unable to receive payments or fulfil their financial obligations. Although the group resolved the problem quickly, the backlog of payments, the fines and the damage to the bank's image took time to put right.

"Any outage is extremely expensive," says Derek Siveter, director at Stulz UK. "The case with RBS shows that. It happened over a short space of time, but it took ages to pick up and get confidence back in their market."

That's why financial services companies are of particular interest to the cooling industry: the systemic importance and sensitivity of the information they handle means that higher standards of security and resilience are demanded of their data systems.

"Compared with a lot of commercial data centres, the financial sector is much more focused on security."

"That stipulation means we rely on best engineering practice rather than value engineering," Siveter says. "It's something that the financial firms, their architects and consultants will pay for: a good, resilient and secure system."

But that's not true of every industry. Compared with a lot of commercial data centres, the financial sector is much more focused on security. Most institutions ask for dual power supplies and dual cooling sources to be fitted to their equipment to avoid single points of failure.

Energy costs tend to be secondary in this security-driven approach. Financial firms are certainly more switched on to environmental sustainability than they were in the past, but the drive towards green data centres has its limits.

"Our customers do a lot of green research in their annual reports, but they're fundamentally cautious," Siveter says. "A few years ago, a new range of fans was launched that promised really big energy savings. We talked to one US bank about running-cost savings of hundreds of thousands of pounds every year, but because the technology was new and hadn't been tried and tested, he said that with the huge amounts of data the bank handles, he didn't want to take the risk in case the systems go down. While banks are concerned about energy costs, those savings are very much secondary to good engineering."

Finance's new technology anxiety

Most companies use computer room air-conditioning (CRAC) units, but the anxiety about new technology persists. Industry talk at the moment is all about direct free cooling, which brings in fresh air without any mechanical cooling. Big data centres, such as those run by Google and Yahoo in the US, use these techniques.

"Industry talk at the moment is all about direct free cooling, which brings in fresh air without any mechanical cooling."

"There are higher temperatures in the room, so it's all very green and low energy," says Siveter. "But it's easier for these companies to experiment than it is for financial institutions. If Facebook or Google goes down, nobody is going to worry too much about it. But if a banking system goes down, it's extremely expensive. I don't think that many banks are looking at direct free cooling at the moment. It has great advantages in low energy, but it's not very secure and doesn't have the resilience that the banks want."

Those financial groups that avoid the 'riskier' new technology still experience problems with their data centres. Some of the issues are down to the mechanics of air-conditioning equipment, but most are due to faults with the electrical supply.

"Anything that's mechanical will end up breaking," Siveter says. "But most systems are usually designed with some kind of back-up if one part fails. That's the best way of avoiding costly downtime and data loss. Banks tend to duplicate everything. It all comes down to the designers making sure that there are electrical supplies available for the mechanical equipment. At Stulz, we have dual-power supplies, so if there is an electrical supply fault, they will automatically switch over to a standby power supply."

Stulz: specialist service

Stulz makes exclusive use of electronically commutated fans for its systems. These have reliable, highly efficient motors with a far lower power input than the traditional motor-and-belt systems that were common nine years ago.

"On the cooling side, we try to use indirect methods during winter," says Siveter. "This means that the compressors in the cooling units, which take about 80% of the power, are not required. You still need the fans to circulate the air, but you can transfer it from the outside area into the data centre and still keep the integrity of the centre in one piece."

"Stulz makes exclusive use of electronically commutated fans for its systems."

Unlike companies that provide different forms of air-conditioning to commercial buildings, offices and residential properties, Stulz supplies cooling solutions only for data centres. That puts the firm in a good position to understand the particular needs of financial institutions.

"We're experts in cooling, and have been in business in the UK for 25 years and in Germany for 50 years," Siveter says. "Over the years, banks have been some of our biggest customers. Some of these are small branch offices, but we've also worked with the big data centres at Barclays, RBS, Lloyds, HSBC, Credit Suisse and Santander. We are currently installing a a huge data centre for Santander in Leicester."

Stulz has also worked with Barclays to install two large systems using its CRAC units.

"The whole infrastructure was extremely well designed," says Siveter. "We had two teams of consulting engineers, with one overseeing the others to eliminate any single points of failure. That means they are absorbing a lot of server heat. The designers are making sure there are significant CRAC units available so that there are no breakdowns. The electrical systems are all dual-backed right the way through, with power feeds coming in from the grid. It's common to have dual inputs from two different power sources in case one fails."

ASHRAE standards

The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) recently released new standards for IT equipment air supply. With guidelines gradually changing over the years, it's something that Siveter keeps a close eye on.

"Stulz has worked with Barclays to install two large systems with its CRAC units."

"In the data centre, there is generally an increase in temperatures," he says. "ASHRAE has technical committees that look at these things, one of them is specifically concerned about data centres. A recent committee, which included most of the major equipment manufacturers - CISCO, DELL, IBM and HP - came up with a recommendation that has built in a very wide envelope, much broader than you would have had, say, 15 years ago, where they wanted a tight control on computer temperatures.

"The ASHRAE recommendation is now 18-27°, with a wide humidity band for the air temperature going into the servers. That means that the exhausts from the servers are now coming out at 30-35°."

As a dedicated supplier of cooling solutions for data centres, Stulz is in an ideal position to maintain the levels of resilience and security that financial groups rightly demand. With recent stories highlighting the fragility of certain IT infrastructures, those demands are more important now than ever before.
