- 24.09.2021 09:45 am
The financial services sector has weathered major crises – everything from the 2008 financial crash to natural disasters. While these upheavals presented significant challenges in the short term, they taught valuable lessons in how to cope with future uncertainty and prepare for the unknown.
Above all, these crises highlighted the need to build resilience through improved data infrastructure. This is especially true of the pandemic, which demonstrated that organisations must adapt and improve how they manage data, with the focus firmly on integration and permissioned access.
The pandemic has been a major global event, but it will not be the only challenge banks have to contend with. Disruptive events and sudden changes in trade flows are more common than many institutions care to remember. Aside from hurricanes and disease outbreaks, market bubbles and continuing tensions between major trading countries and blocs have the potential to overturn “normality” at any time.
In addition, a more sustained threat is emerging in the shape of fintechs and neobanks, which have the potential to eat away at market share – especially in retail banking, where slick interfaces, ease of use, new products and lower costs are attractive to consumers and businesses alike.
For established financial institutions, success will depend on how well and how quickly they respond to these sudden crises, while at the same time pushing forward more incremental change to out-compete the fintechs or forge new collaborations with them.
Overcoming both sets of challenges will unavoidably require more advanced data management, especially in the large number of institutions that still struggle with complex legacy systems. Whether it is adapting to new crises, adopting or creating new applications, or building interfaces with partners, financial institutions must harmonise the masses of data they need. That requires them to combine external, unstructured data in various formats with the data from banks’ own diverse systems.
In a crisis, this needs to happen at speed. The collapse of Lehman Brothers in 2008 took almost everyone by surprise, as did this year’s blockage of the Suez Canal, which had serious financial repercussions. When events like these blow up like storms, leadership teams need fast access to an overarching view of their business, along with accurate and detailed scenarios ready to hand that give them a natural advantage when planning the next move.
This demands the integration of the different data layers in a single platform, with applications sitting on top of the vertical stack. The new architecture must be simple if it is to provide the agility needed to adapt to sudden change.
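To make the idea of harmonisation concrete, here is a minimal sketch of what an integration layer does at the record level: source-specific adapters map differently-shaped feeds onto one common schema. The field names, formats and adapter functions below are illustrative assumptions, not any real bank's data model.

```python
# Hypothetical sketch: normalising customer records from two
# differently-shaped source systems into one common schema.
from datetime import datetime

COMMON_FIELDS = ("customer_id", "balance_eur", "updated_at")

def from_core_banking(rec: dict) -> dict:
    # Assumed legacy core system: numeric ids, balances in cents, dd/mm/yyyy dates.
    return {
        "customer_id": str(rec["CUST_NO"]),
        "balance_eur": rec["BAL_CENTS"] / 100,
        "updated_at": datetime.strptime(rec["UPD_DT"], "%d/%m/%Y").date().isoformat(),
    }

def from_partner_api(rec: dict) -> dict:
    # Assumed external fintech feed: string ids, decimal strings, ISO timestamps.
    return {
        "customer_id": rec["customerId"],
        "balance_eur": float(rec["balance"]),
        "updated_at": rec["updatedAt"][:10],
    }

def harmonise(records, adapter):
    """Apply a source-specific adapter and keep only the common schema."""
    return [{k: adapter(r)[k] for k in COMMON_FIELDS} for r in records]
```

Once every feed passes through an adapter like this, the applications on top only ever see one schema, which is what keeps the architecture simple enough to change quickly.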
It is also how established financial institutions will facilitate partnerships with neobanks and fintechs in the age of open banking. For these partnerships to work, the fintechs need access to the wealth of data about customer banking and financial behaviour. But that data must be usable and interoperable, or the partnership will never meet its objective.
Harmonised data enables new capabilities across the organisation
With this in mind, future-proofing a financial institution requires its systems to make data from all necessary sources available on demand and in a consistent and accurate format. This is the essential platform on which to build new capabilities that increase agility and provide new services to clients and customers, using advanced analytics, machine learning and application programming interfaces (APIs). New capabilities provide actionable insights that transform efficiency and increase resilience.
With access to fully harmonised data, business managers benefit from analytics and visualisations that quickly give them a deeper understanding of what is happening across the organisation. An integration layer normalises the underlying data, while machine learning, dynamic queries and API management capabilities operate on top of it.
Consistent, harmonised access to disparate data, coupled with a wide palette of analytical capabilities, enables banks to assess and address a wide range of risks, emerging threats and critical business initiatives. They gain near real-time enterprise risk management and liquidity management, the ability to act on market signals, and the means to boost capital efficiency through faster and more accurate analysis of threats and opportunities.
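As a loose illustration of the kind of query a normalised layer makes trivial, consider an enterprise-wide liquidity view. Once positions from disparate systems share one schema, the view is a simple aggregation; the record fields and figures below are invented for illustration.

```python
# Hypothetical sketch: an enterprise liquidity view over records that
# have already been harmonised into one schema, regardless of source.
from collections import defaultdict

def liquidity_by_desk(positions):
    """Sum available cash per desk across all source systems."""
    totals = defaultdict(float)
    for p in positions:
        totals[p["desk"]] += p["cash_eur"]
    return dict(totals)

# Illustrative harmonised records drawn from three different systems.
positions = [
    {"desk": "retail",  "cash_eur": 1_200_000.0, "source": "core_banking"},
    {"desk": "markets", "cash_eur":   850_000.0, "source": "treasury"},
    {"desk": "retail",  "cash_eur":   300_000.0, "source": "partner_api"},
]
```

The point is not the aggregation itself but that, without harmonisation, the same question would require bespoke extraction logic for every source system before any analysis could begin.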
Harmonised data is also the key that unlocks advances in artificial intelligence (AI) and machine learning (ML) that have the potential to create better, streamlined user experiences for clients and customers. Getting the data in shape for these technologies is a fundamental requirement and makes implementation less of a chasm to cross, especially if machine learning capabilities are integrated into the harmonisation platform that brings all the data layers together.
This is much easier than stitching together five separate layers, and it will help overcome the significant shortage of expertise in preparing data for AI and ML. It is entirely feasible to integrate data in ways that do not demand an understanding of more complex analytical techniques.
Data harmonisation and integration have become prerequisites in the battle for the future of the banking industry. The future of risk and credit assessment, the ability to respond fast and effectively to major political, health or weather events, and the creation of new open banking partnerships all require institutions to integrate data from across extremely diverse systems. This data has the potential to build huge resilience and agility into banks, future-proofing them and enabling them to seize significant new opportunities. Before all that can happen, though, banks must ensure they have fully and effectively integrated it.