Streamline big data – Talend round table


10 December 2015


Big data, seen as the panacea to cure all banking ills, is capable of improving the customer experience, reducing risk, increasing efficiency and ultimately raising profits. But can it really achieve all this for the industry? Mark Brierley hears from experts across the sector at a Talend round table and tries to find a consensus on just what is and, more importantly, what is not possible.


'Big data' is a term that has spread across the business world like wildfire in the past decade, and the finance industry has been one of many sectors to fan the flames and tout it as the future. For banks, it is seen as a means of providing a new level of personalisation to the customer, of giving a better understanding of risk management and of improving fraud detection. But at a round-table discussion organised by software vendor Talend, there is some scepticism about just how far the big industry players have gone in truly embracing big data.

A recent Talend survey of banking professionals found that 64% of banks report they are currently using big data. Among senior executives specifically, that figure rises to 75%, yet only 30% of professionals at the coal face say the same, suggesting a disconnect between what banks say they are doing and what is actually happening. The headline figure of 64% was certainly taken with a pinch of salt by the participants at the round table.

"Pretty much all of our clients [in the banking industry] say 'yes we know that big data is the way to go', but it's very different when you get to the actual implentation and delivery of big data," Ben Musgrave, enterprise account manager for BIPB says to a chorus of approval from the other attendees, before confirming "there's far less happening in reality".

A clearer view

But the picture isn't perhaps as black and white as that. After all, the banking industry has been using big data for high-frequency trading and quant funds for more than a decade, so the detachment may exist only between what banks say they are using big data for today and how it is actually being used. There is clearly little benefit to the personalisation of a customer's banking experience if high-frequency trading is as far as the usage currently goes. Perhaps the disconnect is instead a result of the stifling environment the sector has found itself in since 2007, thinks Louise Cooper, financial analyst and journalist for The Times, the Sun and the BBC.

"It's worthwhile stepping back and looking at the banking sector; it's suffered a huge financial crisis and it hasn't fully recovered yet. It's only in the past six months in the UK that SMEs say they have access to credit; it has taken years for banks to recover," she explains. "Given the environment of increased regulation and political pressure, it doesn't surprise me that banks may say they're doing big data, when they're not. They have a lot of other priorities and have perhaps been a bit slow on big data."

For Nimish Shah, banking sector lead at Talend, this apparent delay stems more from an IT and infrastructure standpoint. "The issue for some of the larger banks is that moving from their legacy systems to newer ones is going to be hard," he explains. "Challenger banks are the ones that are really ahead of the curve. The larger banks are saying they're doing it, but I don't think they are in a big way."

That sentiment is supported by the survey results, which found 48% of industry professionals polled cite the limits of legacy systems as the number-one IT challenge facing the sector, with 43% also naming it as the main barrier to realising the benefits of big data.

"The analogy I like to use here is Britain's railways," explains Musgrave. "Britain was the first in the world to build its railways in the Victorian period, but now we're stuck with this old, very expensive legacy railway system. How do you improve that when you're spending all your money on just maintaining it? Whereas, you go to China and all these high-speed rail lines are being built because there was no legacy there to overcome."

Challenger banks can start with a clean slate and invest in the right infrastructure for big data from the outset.

Legacy virus

The problem stems from the way in which legacy systems store and use their data. By keeping data sets in separate silos, which serve only one purpose, it is hard for banks to create a hub into which all data can be poured and the full potential of big data analysis can be realised.

"They're very siloed; they're siloed by business line; they're siloed by region; they're siloed by acquisition - so they can't react quickly," agrees Musgrave. "The problem with legacy IT systems is that doing any upgrades on them is incredibly difficult... so how do you expect them to keep up with the challengers?"

There is evidence that banks are starting to realise this and act upon it. Session chair Patrick Wolfe, professor of statistics at UCL's Big Data Institute, notes that "big data right now internally is an easy sell", with the survey finding that 86% of respondents agree that banks do understand the potential power it could bring to the business.

"I think the banks are coming; this is where the investment is going to come next," agrees Shah. But given the relative immaturity of the industry's understanding of big data, there are concerns about how well banks would be able to use a big-data platform, once installed.

"We're still at the stage where we're amazed by the sheer volume and availability of all these different types of data sets," explains Wolfe. "We're trying desperately to get our arms around it, but we haven't quite yet moved to the place where we want to be, further upstream, where it's not the amount of data that matters, it's how much information you can get out. Of course, adding more data can always help but it can also just be adding more noise."

And this will be crucial to the future success of big data in banking; the quality of the data is just as important as the data itself. Cooper points to the example of financial forecasting to explain: "There is something like a million data points in US economics, but financial forecasting is often quite inaccurate. So why, if big data is so brilliant, doesn't it help us make sense of the financial and economic world?"

The old adage that too many cooks spoil the broth has never been more true than in this instance. If the implementation of big-data platforms is just a conduit for the pursuit of more data, banks are unlikely to achieve their stated aims of improving customer personalisation, risk management and fraud detection. What really matters is the quality of the data being put into the system.

Still, Wolfe concludes on a brighter note, saying that, like most innovations, it is just a matter of time before banks learn how to get the best out of their data. "In a few years' time, I expect us to have a better understanding of big data and we'll have better-managed expectations about the art of the possible," he says. "There are fundamental limits. If you add another data set that doesn't really add any information, all you've done is add noise; so more is better, but only to a point and only if you can exploit it. Right now, we're not quite ready to believe that."

The scepticism with which the round table started hadn't quite been doused by its conclusion, it seems.


Round-table participants
Patrick Wolfe: professor of statistics and honorary professor of computer science at University College London, where he is a member of the department's senior management team and a Royal Society and EPSRC Established Career Research Fellow in the mathematical sciences.

Roger Maull: professor of management systems in the University of Surrey Business School and a founder member of Surrey's Centre for the Digital Economy. His latest research considers the implications of the digital revolution for businesses and society, exploring the change driven by technologies such as the internet of things, and the advent of personal data.

Louise Cooper: chartered financial analyst, writer, Times financial columnist, broadcaster and reporter on economics and market activity for senior economists and policymakers. Cooper is well known for her work on the BBC World Service as a presenter and as senior economics journalist for shows including Newshour and Europe Today.

Nimish Shah: account executive at Talend, focused on helping clients across the banking and financial services sector gain competitive advantage through big-data insights, with cost-effective, easy-to-use, readily adaptable and extremely versatile solutions.

Ben Musgrave: enterprise account manager at BIPB, a specialist business intelligence consultancy with an established and growing global presence, serving clients in the UK, US, South Africa, Singapore and France.