InterSystems: Banking on real-time data to remain compliant

  • Graeme Dillane, Manager, Financial Services at InterSystems

  • 12.07.2018 01:00 pm

We talk about the ‘data revolution’, referring to the huge influx of data experienced over the past few years and the potential value it holds when analysed. However, many businesses – and banks in particular – are still searching for ways to unlock the true potential of this data wave.

There are many innovators within the industry who can see what needs to happen. In a recent study by the Enterprise Strategy Group (ESG), commissioned by InterSystems, 38% of respondents said they had between 25 and 100 unique database instances, while another 20% had over 100. This was a general survey, not confined to the banking industry, but it gives some indication of the scale of the challenge: typically, these multiple databases remain as silos.

In today’s industry, the need to become ever more competitive and grow revenue streams is encouraging a new approach to data and its analysis, one that promises greater insight into customers and the ability to devise new deals to suit their needs. At the same time, the stream of new regulations over the past decade has required banks to find new ways of handling and processing data.

With so many regulations implemented over the past 10 years, banks have struggled to form a long-term vision. In some cases, a different application has been deployed for each new regulation, creating yet more data silos. Banks have managed to meet the requirements, but the data often remains in many separate pools.

The revised Payment Services Directive (PSD2) could open up the banking industry entirely. It requires that banks put an end to data silos, with enterprise-wide integration and the ability to analyse data in real time.

To do this, they need a unified data platform that can integrate any number of silos, reaching out to disparate databases, bringing the information back and making sense of it. This platform should handle massive volumes of data, scale up easily as those volumes grow, and absorb data from real-time activity, transactional activity and document databases.
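To make the idea concrete, the pattern of reaching into disparate silos and assembling one coherent view can be sketched in a few lines. This is a minimal illustration, not InterSystems' product: the two "silos" are simulated as in-memory SQLite databases, and all table and customer names are hypothetical.

```python
import sqlite3

# Two hypothetical silos, simulated as separate SQLite databases:
# a core-banking ledger and a payments system.
ledger = sqlite3.connect(":memory:")
ledger.execute("CREATE TABLE accounts (customer_id TEXT, balance REAL)")
ledger.execute("INSERT INTO accounts VALUES ('C001', 2500.0), ('C002', 140.0)")

payments = sqlite3.connect(":memory:")
payments.execute("CREATE TABLE transfers (customer_id TEXT, amount REAL)")
payments.executemany("INSERT INTO transfers VALUES (?, ?)",
                     [("C001", -300.0), ("C001", -120.0), ("C002", 60.0)])

def unified_customer_view(customer_id):
    """Reach into each silo and assemble a single view of one customer."""
    (balance,) = ledger.execute(
        "SELECT balance FROM accounts WHERE customer_id = ?",
        (customer_id,)).fetchone()
    transfers = [amount for (amount,) in payments.execute(
        "SELECT amount FROM transfers WHERE customer_id = ?",
        (customer_id,))]
    return {"customer_id": customer_id,
            "balance": balance,
            "recent_transfers": transfers}

view = unified_customer_view("C001")
```

A real platform would do this federation at scale, across heterogeneous database technologies, but the shape of the operation is the same: query each silo, join on a shared key, return one answer.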

The platform must also have the agility to separate out the data needed. It must allow data to be interrogated, even within large data sets, so that the bank can comply with regulatory requirements such as answering unplanned, ad hoc questions from regulators. Banks also face the challenge of new market entrants in the form of third parties who, through APIs, are authorised to access and transact on behalf of customers. Banks are required to provide this information upon request, and will only be able to do so if they have a unified platform that can integrate the current silos.
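The third-party access obligation can be pictured as a consent check sitting in front of the unified view: a request is served only if the customer has authorised that provider for that scope. The sketch below is an assumption-laden simplification (the consent registry, provider IDs, scopes and account data are all invented), not the PSD2 API itself.

```python
# Hypothetical consent registry: which third-party providers (TPPs)
# each customer has authorised, and for which scopes.
CONSENTS = {
    ("C001", "tpp-budget-app"): {"read_balance"},
    ("C002", "tpp-payments-co"): {"read_balance", "initiate_payment"},
}

# Stand-in for the unified data platform's customer view.
ACCOUNTS = {"C001": 2500.0, "C002": 140.0}

def handle_tpp_request(customer_id, tpp_id, scope):
    """Serve a third-party API request only if the customer consented."""
    granted = CONSENTS.get((customer_id, tpp_id), set())
    if scope not in granted:
        return {"status": 403, "error": "consent not granted"}
    if scope == "read_balance":
        return {"status": 200, "balance": ACCOUNTS[customer_id]}
    return {"status": 501, "error": "scope not implemented in this sketch"}
```

The point is that the consent check is cheap; the hard part is the `ACCOUNTS` lookup behind it, which only answers quickly if the silos have already been unified.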

The advantage of making such an investment is that it takes the bank far beyond compliance. The bank will then have a secure, panoramic view of disparate data which can be used for distributed big data processing, predictive and real-time analytics, and machine learning. Real-time and batch data can be analysed simultaneously at scale, allowing developers to embed analytic processing into business processes and transactional applications, enabling programmatic decisions based on real-time analysis.
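One way to see what "embedding analytic processing into transactional applications" means in practice: an analytic model runs inline as each transaction posts, and the transaction path itself makes a programmatic decision. The sketch below uses Welford's online mean/variance algorithm to flag anomalous amounts; the threshold, warm-up count and sample amounts are illustrative assumptions, not a production fraud model.

```python
import math

class RunningStats:
    """Welford's online algorithm: mean and variance updated per transaction."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
    def std(self):
        return math.sqrt(self.m2 / self.n) if self.n > 1 else 0.0

def post_transaction(stats, amount, threshold=3.0):
    """Programmatic decision embedded in the transaction path: flag amounts
    more than `threshold` standard deviations from the running mean."""
    flagged = (stats.n > 10 and stats.std() > 0
               and abs(amount - stats.mean) > threshold * stats.std())
    stats.update(amount)  # the model learns from every transaction it sees
    return flagged

stats = RunningStats()
decisions = [post_transaction(stats, a)
             for a in [20, 22, 19, 21, 20, 23, 18, 22, 21, 20, 19, 500]]
```

Because the statistics update in constant time per event, the same check works at transactional scale without a separate batch pipeline, which is the essence of analysing real-time and batch data together.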
