IBM: banks - agents of data
In September, IBM hosted a dinner for senior bankers at the Royal Exchange in London. Representing IBM were Vivek Bajaj, director global banking and financial markets, IBM Big Data; Graham Cobb, European banking and financial markets lead, IBM Big Data; Laurence Trigwell, global client executive, HSBC; and Lauren Walker, UK and Ireland big data and analytics practice leader.
Chris Skinner, chairman of the Financial Services Club, opened Future Banking's September dinner for senior bankers in London, UK, by outlining the relevance of big data to banking. Banks have moved from a physical sphere to a remote one: from protecting cash in branches (as that was where the money was) to protecting data online (as that is where the money is now). But, Skinner continued, it is not just about looking after that data; it is about using it in three forms: for revenue, risk and advice.
Speaking on behalf of IBM, Vivek Bajaj, director of global banking and financial markets, gave a keynote address explaining that big data is not an objective in and of itself; rather, its value lies in applying big data and analytics techniques to significantly improve business outcomes. Drawing on his experience of working with bank CXOs in all major markets, he described how market leaders use big data and analytics to drive revenue, reduce risk and improve operations. Revenue is driven by leveraging transactional and interaction data to personalise client experiences; in the risk domain, big data improves the accuracy and speed of fraud detection and investigation, saving companies millions; and, from an operations perspective, big data is about running the business more predictively, to minimise downtime and maximise client satisfaction.
Harnessing big data
Through lively contributions from the assembled guests over dinner, several important themes emerged, chiefly that big data is about far more than just data analysis: it is about data leverage, provision, sourcing and mining. It is contextual, location-based, predictive, proactive and personal. Banks can and should use data to compete. But if data is to be a new battleground, many warriors do not appear to be fighting. Perhaps they are simply reluctant to engage, overly concerned with internal structures rather than external battles.
Another important theme to emerge was that banks able to take their back-end risk and transaction engines and integrate them with data-leveraged front-end tools would gain a serious competitive advantage. But, as one dinner guest observed, there are no 'big data projects', only projects for solving business problems. And therein lies the rub. Technology is only as good as it is programmed to be; if it is programmed to address the wrong question, it is next to useless. Hence the need always to ask the right questions of data.
According to research group Forrester, only 12% of data is mined effectively. Dinner guests were invited to reflect on what could result if all of a bank's internal data were utilised or, better still, if internal data flows were combined with external flows of social media and news. What might happen if every customer movement could be detected and analysed in real time for, say, AML or counter-fraud purposes, or if every customer footfall could be identified for a potential sale or service? It is a dream for many but a reality for only a few, mainly because of internal battles within banks over digitisation.
An important secondary issue here is the degree of customers' ease with issues arising from digitisation. That many bank customers simply do not read the T&Cs of contracts they accept online is well known. What might happen if they suffered losses as a result? The bank would hardly make itself popular by suggesting that the losses were the customer's fault.
The evening progressed into a debate about 'analysis paralysis': whether banks should really bother harvesting all this data, and how to distinguish the relevant from the redundant.
The discussion ended with consideration of the big data needs of regulators, and of the regulatory demands that will push the industry towards more real-time reporting of trades, transactions and flows. The feeling was that, at present, regulators may have the ability and the desire, but not the budgets or the legislation-derived powers, to accomplish the degree of data drilling necessary for such oversight to be fully effective.
A convivial evening ended with broad agreement that data in financial services was critical, but the challenge was to know what data was critical to whom.
Christopher Watts, co-publisher of Future Banking, acted as moderator.