Centre stage – optimising data


10 December 2015


At banks, the most important engine for driving growth is data: customer data to enhance sales; regulatory data to keep pace with increasingly prescriptive regulation; and market data, and the speed of its delivery, to vanquish trading rivals. Future Banking speaks to industry experts to find out how the market leaders are handling their load.


Optimal ways of storing, mining and delivering data are all crucial elements in a bank's data strategy, but any strategy would be etiolated without a proper plan for storing data effectively and efficiently. Hence the emergence of the all-important data centre - its location, cabling, network connectivity, flexibility and the technologies used within it - as the core instrument of a bank's operations.

Deploying data centre optimisation technologies, such as server virtualisation, more efficient air-cooling devices, installations for extracting hot air, intelligent infrastructure management devices and more powerful control consoles, is also essential to ensure effective performance as data loads increase. It also helps banks avoid the criticism of an increasingly influential 'green lobby' that shapes UK Government and EU policy and thinking.

The UK Government has its CRC Energy Efficiency Scheme and the EU its own carbon trading scheme - Emissions Trading Scheme Phase II - both seeking to penalise inefficient data centres that use too much power and produce too much CO2. The fact is, of course, that data centres are very power hungry.

Outsourcing: an increasing trend

One recent trend in banks has been the optimisation of existing data centres, or the outsourcing of the facility to external service providers that will run a data centre for a bank, either on a co-located basis or via a more comprehensive cloud computing model, with the vendor providing the entire web of IT equipment and network connectivity; applications are then simply pulled out of the cloud provided by the third-party supplier.

David Young, director of change and innovation at Metro Bank, who is responsible for harnessing the power of technology to deliver outstanding service for Metro Bank's customers, is mindful that a bank must have a high degree of confidence in its data centre partners.

"We visited [our partners'] data centres and saw how the data is stored and managed," he says. "Other factors, such as energy usage and efficiency were taken into account, but we are simple folk at Metro Bank: our partners are good at running data centres in an optimised way and we are good at running a bank. So, we stick to our roles and have confidence in our partners."

Outsourcing of a bank's data centre, and associated services, is therefore eminently possible. Pricing will be a key factor determining where market participants, in particular, decide to place their trading engines. In an era of low equity trading volumes post-2008, latency, convenience and capacity - while still paramount - won't be the only considerations.

The cloud

Cloud computing is encouraging some commercial data centre providers to construct new facilities in the hope that 'if we build it, they will come'. Any move to a new data centre needs a robust migration strategy, plus a good cabling and switching design to provide superb connectivity. The correct mix of fibre and copper cabling also helps to reduce heat, minimise latency and improve efficiency.

An innovative approach is demonstrated by Nationwide. Daryl Brookes, head of IT service delivery, and David Bolt, head of data centre infrastructure and hosting, have assiduously weighed up the pros and cons of using cloud-based applications and, for them, the only solution is to use the cloud purely internally to provide greater flexibility.

"Protection of data is paramount so we have little appetite for risk. We always only use a private cloud run on our own data centre infrastructure. We do use some third-party services that reside in the cloud but no customer data ever goes on them," they say.

Using a private cloud gives Nationwide flexibility and agility. The bank is proud of its commitment to citizenship and sustainability. Retaining control over its data centre allows it to change the layout to optimise energy efficiency and to run servers at higher temperatures in order to spend less on cooling.

New builds

Demand for increased power in banks' data centres is being driven by a plethora of innovative concepts, such as desktop virtualisation and algorithmic trading, as well as by the new era of prescriptive regulation in which banks must meet an insatiable demand for data from not just regulators but customers, business partners and all other stakeholders.

Greater power means greater responsibility, more expenditure on cooling and spiralling electricity costs. Consequently, optimising data centre facilities to cope with increasing data loads while containing electricity and running costs is a key concern. This is true whether a bank is considering a new-build facility, a co-location migration or an overhaul of an existing facility.

For Mikael Munck, CIO at Saxo Bank, the priority for the bank's overall data strategy is developing cutting-edge trading technology for itself and for the other banks that buy its solutions. Underlying these trading solutions - and the bank's own online banking model - are Saxo Bank's state-of-the-art data centres and their contribution to the ongoing development of its fundamental business proposition.

"Data centres are crucial to us," says Munck. "We must have the latest technology to support our business. It is a big investment, but every once in a while you need to make that kind of investment." The renewal of its data centre infrastructure covers everything from the routers and servers to an upgrade of the networks that carry data.

Banks should become increasingly adept at specifying the design criteria that will help their data centres attain efficiency: whether to use expensive intelligent infrastructure monitoring devices or water chillers, for example; whether to locate in the far north to take advantage of 'free cooling'; or whether to use virtualised servers to reduce the number of servers and data centres required.
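
One yardstick that helps when weighing such criteria is power usage effectiveness (PUE): total facility power divided by the power that actually reaches the IT equipment. A minimal sketch, using purely illustrative power figures (the loads below are assumptions, not measurements from any real facility):

```python
# Illustrative PUE comparison: mechanical chillers vs 'free cooling'.
# All power figures are hypothetical assumptions, for illustration only.

def pue(it_load_kw, cooling_kw, other_overhead_kw):
    """Power Usage Effectiveness = total facility power / IT power."""
    return (it_load_kw + cooling_kw + other_overhead_kw) / it_load_kw

# A facility with 1,000kW of IT load
chilled = pue(1000, cooling_kw=600, other_overhead_kw=150)   # water chillers
free_air = pue(1000, cooling_kw=150, other_overhead_kw=150)  # northern-climate free cooling

print(f"Chilled: PUE {chilled:.2f}")    # PUE 1.75
print(f"Free air: PUE {free_air:.2f}")  # PUE 1.30
```

Free cooling shows up directly as a lower PUE, since the same IT load is delivered with far less cooling power.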

There are specialist advisers to help banks with determining these important design criteria.

Optimising existing data centres

Virtualisation of servers is possibly the biggest single step a bank can take to improve the efficiency of its data centres, and the largest financial institutions have, needless to say, already done so. Virtualisation has played a key role in enabling financial institutions to consolidate their information and reduce the number of data centres they run. Consolidating a bank's data centre estate across multiple countries - each with different regulations, data protection acts and so on - is difficult but, thanks to virtualised servers, it is now possible for two or three mega global data centres to serve the regional needs of multinational banks.

The next most important step is dealing with the aftermath of virtualisation, which generally raises temperatures significantly as usage rates - and therefore heat - increase for each virtualised server. The servers are working harder, so the cooling systems must work harder too. That is why adopting dynamic CRAC (computer room air conditioning) units that can respond to temperature changes, installing cold aisle containment and using a good design layout are essential in virtualised facilities if power requirements and running costs are not to skyrocket. A 'green' (efficient) data centre is one that has been virtualised, but where other management software and design-led initiatives have also been deployed to contain the heat generated.
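
The trade-off described above can be sketched with back-of-the-envelope arithmetic. All the figures below (server counts, consolidation ratio and power draws) are illustrative assumptions, not vendor data:

```python
# Hypothetical before/after virtualisation comparison.
# Every figure here is an illustrative assumption.

servers_before = 200     # physical servers, each lightly utilised
watts_lightly_used = 250 # assumed average draw per lightly used server (W)

consolidation_ratio = 10 # assumed VMs per physical host after virtualisation
servers_after = servers_before // consolidation_ratio
watts_busy = 600         # assumed average draw per heavily used host (W)

total_before_kw = servers_before * watts_lightly_used / 1000  # 50.0 kW
total_after_kw = servers_after * watts_busy / 1000            # 12.0 kW

heat_increase_per_server = watts_busy / watts_lightly_used    # 2.4x hotter per box
print(f"Fleet power: {total_before_kw} kW -> {total_after_kw} kW")
print(f"Heat per server: {heat_increase_per_server:.1f}x higher")
```

Fleet power falls sharply, but each remaining host runs far hotter - which is why containment and responsive cooling matter more after virtualisation, not less.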

The heat problem at virtualised data centres and the subsequent power/cooling challenge is only likely to get worse as future trends kick in such as the increasing adoption of desktop virtualisation (or 'thin' clients as they used to be called). If you take the hard drive away from the office or branch computer, and deliver applications and operating systems directly from the data centre, then you are adding another source of heat to the facility - compounding the cooling challenge.

What are the leading causes of downtime, data loss and hardware damage, and how does Barclays guard against them?
That was the question Future Banking asked Anthony Watson when he was CIO at Barclays. Anthony replied: "85 to 90% of the leading causes of downtime - such as environmental, hardware/software malfunctions and human error for data loss - are usually preventable. We therefore deploy a hybrid strategy of best practice management and monitoring processes, coupled with the appropriate operational technology to support the efficiency of executing those processes. In addition, there are internal audit and compliance teams that aren't associated with technology, and who constantly review and assure that our processes, people and technology fulfil their purpose and reduce risk to the bank."

This emphasises how data centre optimisation goes to the very core of Barclays' business, and is not an ephemeral aspect of the bank's operation.

Green data centres and carbon taxes

One might say that the 'green data centre' is an oxymoron, given how much CO2 these facilities emit. However, it is a realistic goal for banks to pursue, given how important limiting CO2 emissions has become in the pan-European regulatory agenda. Already there is the UK CRC Energy Efficiency cap-and-trade scheme and its forerunner, the EU ETS. These 'carbon tax' initiatives effectively create another reason for investing in 'green data centres'.

Carbon emissions are not only environmentally and reputationally damaging, they can adversely affect the bottom line. Large organisations, including the biggest banks with massive data centres, must now buy allowances for each tonne of CO2 they emit and will, eventually, be placed in a league table that assesses their performance. Under the scheme, the money raised is redistributed, with the best performers receiving a rebate and the worst suffering fines or higher rates. Other so-called 'cap and trade' carbon tax initiatives that seek to use market forces to cut CO2 and increase efficiency are under way worldwide, with the US and Asia looking to follow Europe's lead.
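
The bottom-line effect of such a scheme can be sketched with a rough calculation; the energy consumption, grid emission factor and allowance price below are illustrative assumptions, not actual scheme rates:

```python
# Rough annual carbon-allowance cost for a single data centre.
# All inputs are illustrative assumptions.

annual_energy_mwh = 20_000        # assumed yearly consumption of one facility
grid_factor_t_per_mwh = 0.4       # assumed tonnes of CO2 per MWh of grid power
allowance_price_per_tonne = 15.0  # assumed price per tonne of CO2 (GBP)

tonnes_co2 = annual_energy_mwh * grid_factor_t_per_mwh   # 8,000 tonnes
allowance_cost = tonnes_co2 * allowance_price_per_tonne  # GBP 120,000

print(f"{tonnes_co2:,.0f} tonnes CO2 -> allowances costing {allowance_cost:,.0f}")
```

Because allowance cost scales linearly with energy consumed, every percentage point of efficiency gained comes straight off the carbon bill.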

For Munck, the key driver behind any data centre design project is the issue of energy consumption and finding ways to become more energy efficient. "The whole move from individual servers to virtualisation is a deliberate effort to reduce energy consumption," he says. "We are now up to 40% virtualisation. One of the largest costs of a data centre is energy. In the future, energy efficiency and cost control are likely to remain key."

Even Young at Metro Bank, which outsources its data centres, notes that "energy usage and efficiency are taken into account" when assessing data centre providers.

The right conclusions

The sheer explosion of data in financial services - caused by business and regulatory demands for more customer data on the retail banking side and by high-frequency trading on the investment banking side - means that bank data centres are now major polluters; some even say the sector will overtake the aviation industry. Cutting emissions is therefore a key requirement, and not for some spurious environmental or CSR reason: improving efficiency cuts onerous carbon 'taxes', reduces running costs and improves the total cost of ownership calculation for large data centres, which are generally expected to last for ten years or more.

Whether a bank opts for a new-build data centre, or decides to optimise its existing one, or co-locate, or go into the cloud, it makes sense to ensure that it has a clear idea of its objectives and manages the migration without disruption to the business.