Future preparations - benefits of utility-type data management

7 December 2016

Increased automation, declining profitability and a raft of post-crisis regulations have all changed the way financial institutions think about data. Future Banking asks Peter Moss, CEO for SmartStream Reference Data Utility, about establishing efficient data management practices and the benefits of utility-type services.

In his 25 years as a financial data professional, it’s fair to say that Peter Moss, CEO of SmartStream Reference Data Utility (RDU), has seen quite a lot of change. While financial markets have always been data driven and dependent on access to prices, reports and information about what’s happening around the world, the introduction of automation has fundamentally altered the industry and its relationship to data.

“20 years ago, you could walk on to a trading floor in any of the large wholesale banks and they would be heavily populated with hundreds if not thousands of people,” Moss says. “Now, in the very high-volume markets, all of that is pretty much done by computers, which means that the data needs to be more precise. A human is able to deal with grey in a much better way, but computers need black and white. That’s one of the primary things that has been driving the whole focus on the increasing quality of data. Detail now really matters.”

In the past, when financial firms found something wrong with the quality and detail of that data, Moss says they would respond by “throwing bodies” at the problem. “They would hire people, and actually build teams internally to clean the data up and make sure it was as good as it could possibly be,” he says.

After the financial crisis in 2007–08, however, things changed. The two key reasons for this, says Moss, are increased cost pressures and new regulations. “Previously, banks and other financial institutions were able to grow their revenues quite effectively,” Moss says. “But when the financial crisis hit, you had a situation where suddenly everybody had to do a bit of a reset. I don’t think it was obvious to everybody at the time that it was going to be so hard to rebuild the growth in revenues, but if you look historically over the past six or seven years, revenues in the capital markets have been largely flat.”

“You also have the amount of regulation being thrown at financial institutions. Everybody is being asked to report to the regulators on various activities within the capital markets. Those regulatory reports are built on a lot of data that is managed within financial institutions and, of course, if the data is wrong then the regulatory reports will be wrong as well. People are starting to realise that regulators are becoming far less tolerant of inaccurate reporting. So there has been a lot of money spent making sure the regulations are applied correctly.”

View on operations

Taken together, Moss says, issues around profitability and compliance have “affected the way a lot of the banks look at the way they operate”.

“They realised that they all have fairly sizeable teams of data management professionals, and that things could be done more effectively and efficiently,” he says. “Of course, automation still requires high-quality data, but the way in which you actually build that quality is starting to get quite a bit of scrutiny. Firms are at a point where they have reasonable data, a lot of the work is being done inside the organisation and they are starting to look more creatively at ways in which they can perhaps get the same results.”

So how are financial firms reassessing the efficiency of their data management practices? What are those “creative ways”? For Moss, utility-type services, where several banks share and pool their processing costs through a trusted third party, are key.

“There is a lot of focus on utilities generally in the industry, not just in the data management space, but in a whole range of areas,” he says. “I think the utility-type approach, where you take something that is being done in exactly the same way in every financial institution and lift it out so that it is done once for the industry, is starting to get a lot of traction. It’s very much a conversation that is happening at the C-suite, as executives look at ways in which they can improve their margins.”

Area of interest

One area Moss points to by way of example is the legal-entity space, in particular the stringent requirements around know your customer (KYC) and anti-money-laundering (AML) regulations. “This is an area where a lot of organisations are doing exactly the same thing,” Moss says. “Whenever they trade with someone, they have to go through a series of KYC and AML checks. So there is a large focus on the process of opening accounts and vetting the counterparty that you are going to be trading with.”

In an industry as networked as financial services, this is an inefficient way of operating, Moss adds. “All the different counterparties are trading with each other,” he says. “If three different brokers – A, B and C – are doing business with buy-side investment firm D, then each of them has to vet the same institution before they can trade. They are doing this in exactly the same way and are therefore duplicating the work. So I think what is needed is a good data management utility that tackles these common problems on behalf of the industry, because there is no competitive advantage in doing it within your own organisation.”
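The deduplication argument Moss makes can be sketched in a few lines of code. This is purely an illustration of the shared-utility idea, not any real vetting system; the broker names, the firm name and the check itself are all assumptions.

```python
# Toy sketch of the deduplication argument: a shared utility vets each
# counterparty once, instead of every broker repeating the same checks.
# All names and the check itself are illustrative assumptions.

vetting_runs = 0

def kyc_check(counterparty: str) -> bool:
    """Stand-in for a costly KYC/AML vetting process."""
    global vetting_runs
    vetting_runs += 1
    return True  # assume the counterparty passes for this sketch

class KYCUtility:
    """Runs each check once and shares the result with all members."""
    def __init__(self):
        self._results = {}

    def vet(self, counterparty: str) -> bool:
        if counterparty not in self._results:
            self._results[counterparty] = kyc_check(counterparty)
        return self._results[counterparty]

utility = KYCUtility()
# Brokers A, B and C all need to trade with firm D.
for broker in ["A", "B", "C"]:
    utility.vet("firm D")

print(vetting_runs)  # the costly check ran once, not three times
```

Without the utility, the same `kyc_check` would run three times, once inside each broker; pooling it through a trusted third party is exactly the cost saving Moss describes.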

A second example Moss offers is reference data management. In this case, firms trading financial products in the market are required to accurately define them with a set of reference data that includes the instrument, its characteristics, and how the product is traded and settled. “Again, every organisation is trading broadly the same set of instruments,” Moss says. “Particularly in liquid markets, there is a lot of value in actually getting that reference data right, but there is no reason why everybody should be doing the same thing themselves.”
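To make the reference data idea concrete, a single instrument record of the kind Moss describes — the instrument, its characteristics, and how it is traded and settled — might look something like the following. The field names and values here are illustrative assumptions, not SmartStream's actual schema.

```python
from dataclasses import dataclass

# Hypothetical, simplified instrument reference data record.
# Field names are illustrative assumptions, not any vendor's schema.
@dataclass(frozen=True)
class InstrumentReference:
    isin: str              # instrument identifier
    asset_class: str       # characteristic, e.g. "equity"
    currency: str          # trading currency
    trading_venue: str     # where the product is traded
    settlement_cycle: str  # how the product settles, e.g. "T+2"

record = InstrumentReference(
    isin="US0378331005",
    asset_class="equity",
    currency="USD",
    trading_venue="XNAS",
    settlement_cycle="T+2",
)
print(record.settlement_cycle)  # prints "T+2"
```

The point of the utility model is that every firm trading this instrument would consume the same vetted record, rather than each firm assembling and cleaning its own copy of these fields.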

As with KYC/AML regulations, Moss says utility-like services can avoid unnecessary duplication and help build a level of technical know-how for the wider industry to benefit from. “It is better to do it once and have everybody using the same definition because you are more likely to get consistent results as a consequence,” he says. “From the perspective of a utility, having one organisation focused on one thing means you can really build that level of expertise. What I think a data utility would typically try to do is become the specialist in that space, adopting the best practice from the industry and making sure that it is employing the best, most-qualified people. I think the trend in outsourcing generally is towards this kind of specialist-type provider.”

Time to take responsibility

Making the case for establishing new approaches to data has certainly got easier as more and more financial institutions introduce a culture of data responsibility within their businesses. “There’s a big focus on getting the culture right within the financial markets space,” Moss says. “You’ll see a lot of chief data officers now, particularly in the larger organisations. The quality, quantity and value you can get out of data is clearly something that is actively being discussed at the C-suite level.”

Given how busy the industry is set to be in the coming months and years, that can only be a good thing. In the short term, Moss says financial institutions will be bogged down by new regulations, including MiFID II, which comes into force in January 2018, and the Fundamental Review of the Trading Book (FRTB), also scheduled for implementation in 2018. In the long term, Moss adds, things will be “more interesting”, as new concepts like big data, data lakes and insight analytics attract even more focus and attention.

“I think this is where we are going to see many more valuable insights into much larger quantities of data,” he says. “Insights into behaviour and insights into the way markets collectively move. Quite where the value will come from remains to be seen, but from the way the internet industry has evolved over the past few years you can see that there are companies getting very good results from managing data well. Going forward, I think the financial markets will make sure they do everything in their power to catch up.”

Peter Moss is CEO of SmartStream Reference Data Utility (RDU), which is backed by Goldman Sachs, JPMorgan Chase and Morgan Stanley. RDU is a managed service that delivers complete, accurate and timely reference data for use in critical regulatory reporting, trade processing and risk management operations.