Essential resources – finding the perfect data centre


7 July 2016

The past few years have been interesting times for the financial services data centre market. While spending is growing relatively slowly across the board (Gartner places the growth rate at around 2.3% a year), the rise of the cloud is stirring demand and companies are becoming more inventive in their strategies as their data architecture becomes more complex. While many banks continue to outsource, others now see running their own data centres as a competitive advantage.

UBS is one example, paying £28 million to acquire a 150,000ft² data centre in Hayes, West London, last December as its first dedicated UK facility. Capital One is another to deliberately shift away from third-party facilities, opening its own $150-million US data centre outside Richmond, Virginia, in March 2014. Faster implementation of digital banking enhancements is one key driver.

"The objective we set is that we never want the infrastructure to slow down the pace at which our agile teams can operate," says Capital One's CIO, Rob Alexander.

However, the key concerns remain information security and guarding against downtime, data loss and hardware damage. That means distributed locations that can back each other up in the event of an outage are essential.

SpareBank 1, Norway's second-largest bank alliance, had previously split its systems between two Oslo facilities, separated by a single street. It had to rethink its plans after a close call in 2011 when a car bomb exploded in Oslo's government quarter.

As the bomb could have taken out both its data centres, SpareBank 1 began work on a third facility in its existing Oslo office buildings. "We were very lucky to find the 84m² of space to use for this purpose," said Nora Midtsund, SpareBank's IT project manager at the time.

The two Belgian centres BNP Paribas began building in 2013 at Vaux-sur-Sûre and Bastogne illustrate in-house data centre priorities well: sufficient installed electric power capacity (30MW at each site), easy access to the motorway network, multiple telecommunications links and a stable security environment. A western European location like Belgium ticks all those boxes, with a reliable electricity grid and a choice of data carriers. BNP is also focusing on energy efficiency, with its new centres employing 'free cooling'. This makes maximum use of external cooler air to regulate the building and equipment temperature, improving energy performance.

"One of the basic requirements in our planning was the need for high energy efficiency and the use of green technology in these Generation 3+ data centres," says Jacques Godet, head of technology, operations and property services at BNP Paribas Fortis. "The buildings are therefore being tailored to the new generation of high-density servers. By using innovative technology, we will be able to attain an energy efficiency rating of 1.3, a 30% improvement on current performance."

Power usage effectiveness (PUE) is the most commonly employed metric for comparison, generated by dividing the total amount of energy used by a facility by the amount consumed by IT equipment. A score of 1.0 is considered to be ideal. In the UK, new data centres typically target a PUE between 1.2 and 1.5, with the most efficient designs aiming for 1.15 or less.
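As a quick worked example of that calculation (the meter readings below are illustrative, not figures from any facility mentioned here), a site drawing 1,300kWh in total while its IT equipment consumes 1,000kWh scores a PUE of 1.3 - the rating BNP Paribas says it is targeting:

```python
# Worked example of the PUE calculation described above.
# The meter readings are illustrative, not figures from any named facility.

def power_usage_effectiveness(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / energy consumed by IT equipment (1.0 is ideal)."""
    return total_facility_kwh / it_equipment_kwh

# A site drawing 1,300kWh overall while its IT load consumes 1,000kWh
# scores a PUE of 1.3.
print(power_usage_effectiveness(1300.0, 1000.0))  # 1.3
```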

A game of risk

BNP has also signed up to use Qarnot Computing's cloud platform to perform risk calculations. The difference here is that Qarnot's servers are distributed across buildings within Paris, using otherwise wasted heat to warm homes, schools and offices. This cuts the carbon footprint by 75%.

Laurence Pessez, BNP's corporate social responsibility officer, says, "We are proud to be the first French bank to engage alongside Qarnot Computing. Using this platform, BNP Paribas is thus associated with a social enterprise in the area of green IT."

The big technical data centre trend continues the shift to the cloud, driven by cost and flexibility benefits. In 2014, Leeds Building Society allied with Yorkshire Building Society to move their core mortgage and savings application processing into a virtual private cloud running at an HP-managed UK data centre. Three other UK building societies - Ipswich, Loughborough and Dudley - have signed a similar deal with Unisys.

Flexible 'virtualisation' software is enabling this shift, letting banks' computing infrastructure run on commodity hardware across multiple locations. But rather than relying on a single giant cloud system, as the likes of Facebook and Netflix do, banks can use virtualisation to adopt a hybrid model.

This way, they can spread processing across private data centres and on-demand services from public cloud providers like Amazon and Google. This delivers rapid scalability, permits economies of scale and can cut energy use.
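To make the hybrid idea concrete, here is a minimal, hypothetical sketch of the placement logic it implies: keep workloads on private data-centre capacity where possible and burst to a public provider only when that capacity runs out. The job names, core counts and capacity figure are invented for illustration and don't describe any bank's actual setup.

```python
# Hypothetical sketch of hybrid-cloud workload placement: keep jobs on
# private data-centre capacity where possible and burst to a public
# provider only when that capacity runs out. Job names, core counts and
# the capacity figure are invented for illustration.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    cpu_cores: int
    sensitive: bool  # e.g. customer data that must stay in-house

def place_jobs(jobs: list[Job], private_capacity_cores: int = 512) -> dict[str, str]:
    placements = {}
    free = private_capacity_cores
    for job in jobs:
        # Sensitive workloads always stay in the private data centre;
        # everything else bursts to the public cloud once capacity is used up.
        if job.sensitive or job.cpu_cores <= free:
            placements[job.name] = "private data centre"
            free -= job.cpu_cores
        else:
            placements[job.name] = "public cloud"
    return placements

print(place_jobs([Job("risk-batch", 256, False),
                  Job("core-banking", 128, True),
                  Job("analytics", 300, False)]))
```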

Fidelity Investments' state-of-the-art data centre in Nebraska is a good example, containing thousands of x86 servers but using 40% less energy than the company's previous data centres. "It's a very open design that can evolve as we decide to add capacity in the future," says Eric Wells, Fidelity's VP of data centre services.

Fidelity's Centercore design is built from 500kW or 1MW 'CoreUnits' - steel-frame, one-storey rooms. Manufactured off site, they arrive with power connections and cabling already fitted and are then assembled like LEGO blocks. An entire data centre takes only six months to build.

Bank of America is also exploring this option, using standard 20ft shipping containers packed full of servers, storage, networking and cooling equipment to support private clouds at its new data centres. However, there's still a great deal of hesitancy around cloud adoption, with many banks retaining traditional on-premises mainframe systems. A big contributor to that reluctance is data security - an overriding concern for financial services.

Even David Reilly, Bank of America's global technology infrastructure executive, doesn't think public cloud services are a viable near-term option for much of his bank's computing needs. The security isn't sufficient, he says, and the economics "are not compelling yet".

To protect and serve

Tougher data protection rules also prevent banks from relying entirely on public clouds. Though across-the-board encryption does provide a public cloud security solution, safe harbour regulations dictate that banks must know exactly where their customers' data is at all times.

A German bank, for instance, might mandate that its data is stored only in Germany. Using only a private cloud lets it guarantee that customer data won't leave the country.
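A hypothetical sketch of the kind of residency check such a mandate implies is shown below; the dataset name, region codes and policy mapping are invented for this example.

```python
# Hypothetical illustration of a data-residency check: before writing a
# customer record, confirm the target data centre sits in a country the
# policy allows. The dataset name, region codes and policy mapping are
# invented for this example.

ALLOWED_REGIONS = {
    "german-retail-customers": {"de-frankfurt-1", "de-berlin-1"},
}

def check_residency(dataset: str, target_region: str) -> None:
    allowed = ALLOWED_REGIONS.get(dataset, set())
    if target_region not in allowed:
        raise ValueError(
            f"Policy violation: {dataset!r} may not be stored in {target_region!r}"
        )

check_residency("german-retail-customers", "de-frankfurt-1")  # passes
# check_residency("german-retail-customers", "eu-west-1")     # would raise
```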

UK banking start-up Metro Bank has been cloud-based from the beginning and has no qualms over running core banking and other business applications on third-party cloud systems.

"We have no servers at all in our stores or in our head office," says David Young, director of change and innovation at Metro Bank. "Microsoft applications like Office 365 and our CRM suite are run in Microsoft's cloud, and we are comfortable with that."

In the UK, Metro Bank uses two co-located data centres, each of which can run the bank independently. "The risk of failure in the data centre is, in extremis, the loss of all IT capability," says Young. "Everything is replicated twofold or fourfold to mitigate the risk in each of those data centres. The cloud is a critical component of what we do."

This kind of outsourced approach remains popular in the industry. For example, Barclays has closed numerous in-house data centres as part of its Transform programme and Deutsche Bank recently announced it will outsource much of its wholesale banking division's IT infrastructure to HP in a ten-year multibillion-dollar deal.

One key co-location attraction is connectivity: access to a wider range of data links via multiple telecoms carriers. Third-party data centres have emerged as communications hubs because ecommerce and digital channels force banks to handle more digital traffic. It's almost impossible for banks building their own data centres to replicate this, as they typically use only one or two carriers as primary providers.

Co-location specialists such as Equinix and Interxion say their extra connectivity ensures redundancy and lowers costs. Belgian insurer Fidea is a good example, working with Interxion and IT partner Tyneso to build a new private cloud across various European data centre locations.

"Our network performance and neutrality remain assured," says Koen Van Bogaert, solution support architect at Fidea. "With Interxion, we can choose from more than 80 leading national and international carriers, internet service providers and content distribution networks."

Employing Tyneso's network also means flexibility and scalability. "Should we need to urgently increase our storage capacity or internet connectivity, all we need to do is place an order with Tyneso," explains Van Bogaert. "This means we don't have to go through a whole new purchasing procedure every time. Because both companies work with specialists, we don't have to worry about the network, security, the power supply and cooling, or backups."

As digital channels proliferate, data volumes balloon and the use of techniques like real-time analytics grows, banks require larger amounts of computing power and storage to handle business as usual. To cope, they will build more in-house data centres and also continue to work with third-party providers.

The aim is to build the widest web of reliable systems and communications that also offers flexibility and best value. It's a tall order - but one that modern technology and new security standards are making possible.