Where is All the Insight that Banks Were Promised?

  • Richard Price, Head of FSI, UK&I at TIBCO

  • 30.11.2022 11:45 am
  • #banks

Banks are not enjoying the returns they expected from all that they’ve invested in trying to drive value from data. The problem, to be absolutely clear, is not a lack of data. They’ve got a ton of that, with volumes continuing to rise. What they lack is meaningful insights of the kind that will drive actionable decision-making on the ground and give their businesses new energy and direction in a difficult economic and competitive climate. 

So what’s the issue? Why are these financial services institutions not getting the returns they were anticipating, given the large sums they’ve thrown at the problem? For one thing, much of their huge accumulation of data is still trapped in siloes, effectively unavailable outside of its particular line of business. Banks are still operating mainframes, data lakes and data warehouses. There’s also quite a bit of shadow IT that has been allowed to gather in the background, impeding clarity still further. And institutions are trying to leverage a confusing mix of different tools and methodologies adopted at different phases of digital evolution.

Banks have been told for years that everything should be moved into the cloud as soon as possible. But this hasn’t happened as expected, and the results so far have in any case been mixed. Many are now working to a hybrid model, with some computing power still in-house and some in the cloud, perhaps spread across multiple cloud platforms, which can have the effect of confining data to yet more separate siloes.

The financial services sector has been pondering numerous more contemporary approaches to data management, from microservices and APIs to data meshes, data fabrics and event-driven architectures. None of these is a bad idea; indeed, all of them represent in one way or another the future of banking, and every one has the potential to get banks closer to where they need to be. The priority now is to work out where all these strands of technology fit with their business goals. And many are running into difficulty here.

It’s time for a reset that focuses not so much on the speed of technology adoption as on the speed of business. It’s time for more emphasis on the consumption of data and less on its acquisition. Banks don’t need more dashboards and BI tools. They need better insights in the context of what they are actually trying to achieve as businesses. And they want those insights in real time, not as the culmination of a convoluted process of searching endless haystacks of data to find the perfect needle.

The focus must switch to thinking about what data is for, and away from agonising over its origins. Never mind its lineage; what’s its current context? What purpose did we actually want the data for? What business value do we want it to deliver? How can we use it for decisions in real time?

If we want to know the best next step for a customer, we don’t want 400 questions that might or might not relate to that decision. This is where technology must augment and validate human decision-making. When humans need immediate answers, it’s no good swamping them with everything on that subject. You just need the means to assess the value of data until you reach a conclusion that’s executable and that drives insight for the business. You need enough information to be confident, which does not necessarily amount to perfection. Perfection kills progress. You need data that can actually be used and reused. Don’t get hung up on the road you take. Focus instead on the destination.

Banks need a partner who gets the problem and who can leverage a range of methodologies; one that understands the approach is always less important than the outcome. Every customer’s business is different, but they are all after the same thing: better and more relevant insights, faster.

By assessing data value first, supported by active metadata management, banks can avoid common pitfalls, like opting to do nothing or finding themselves paralysed by trying to combine too many use cases in a kind of technology dating game. This approach helps solve three fundamental problems: technology complexity, which a platform selection that serves multiple use cases and actors can remove; disagreements over who funds a project based on outcomes; and the challenge of deploying data engineering manpower efficiently so as to deliver value early. Do all this accurately, backed by C-level sponsorship, and the chance of a successful programme is immeasurably increased.

The European Bank for Reconstruction and Development (EBRD) is a good example of an organisation that had more data than it knew what to do with. Actionable business insights were not forthcoming. What the bank needed was a more holistic view of its data in the form of a platform that allowed it to scale, adapt, and remain resilient while it phased out older technologies and accelerated adoption of new ones. With TIBCO’s help it now has the tools it needs to support real business initiatives and objectives. People on the ground have a singular and holistic view of data, and the organisation, far from drowning in historic data, now has the ability to handle any future increases in traffic levels.

The lesson for the EBRD has been that not all data is equal. Just because you have a lot of it doesn’t mean you have to use it all. It’s about knowing where to start in the quest for value.
