Banks now have to squeeze the maximum value from all of their data faster and more accurately than ever.
Not only must they compete with the initiatives of rivals; they also face new demands from regulators and industry watchdogs eager for accurate and easily digestible reports and audits.
However, as data volumes balloon, severe problems are emerging with regard to the quality, consistency, provenance, duplication and accessibility of data in large enterprises such as banks, insurance companies and telecommunications providers.
Resolving these challenges using traditional methods diverts key staff and costs companies unnecessarily large amounts of time and budget, putting them at risk of losing out to competitors or failing to comply with regulatory requirements.
The solution that is rapidly becoming established as roughly 50 to 75 per cent faster, cheaper and more accurate is the highly automated Metadata Driven Estate (MDE).
MDE’s high degree of automation minimises the need for costly and slow human effort to write software. It accurately discovers and documents the “as is” movements and transformations of data that occur between source systems (e.g. point of sale), through intermediate systems (e.g. a data warehouse), and on to the business end users. The industry term for this is end-to-end Data Lineage, and it is a requirement of BCBS 239 regulatory compliance. MDE also analyses how data is used by business end users and by applications such as marketing campaigns. This allows you to easily change the business rules that transform and re-purpose your data, so that you can quickly deliver new functionality or add new data feeds at the pace the business needs.
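To give a flavour of what automated end-to-end lineage discovery involves, here is a minimal Python sketch. It is purely illustrative, not MDE's actual implementation: the toy SQL parsing, the table names and the `trace_to_origin` helper are all assumptions. Real tools parse full SQL dialects; this sketch handles only a simple "INSERT INTO ... SELECT ... FROM ..." form read from a query log, builds a source-to-target graph, and walks a table back to its point of origin.

```python
import re
from collections import defaultdict

# Toy pattern: real lineage tools parse full SQL dialects, joins, views etc.
INSERT_SELECT = re.compile(
    r"INSERT\s+INTO\s+(\w+(?:\.\w+)?)\s+SELECT\b.*?\bFROM\s+(\w+(?:\.\w+)?)",
    re.IGNORECASE | re.DOTALL,
)

def build_lineage(query_log):
    """Map each target table to the set of source tables that feed it."""
    lineage = defaultdict(set)
    for sql in query_log:
        m = INSERT_SELECT.search(sql)
        if m:
            target, source = m.group(1), m.group(2)
            lineage[target].add(source)
    return lineage

def trace_to_origin(table, lineage, seen=None):
    """Walk the lineage graph backwards to the ultimate source systems."""
    seen = seen or set()
    sources = lineage.get(table, set())
    if not sources:
        return {table}                      # no upstream feed: a point of origin
    origins = set()
    for src in sources - seen:
        origins |= trace_to_origin(src, lineage, seen | {table})
    return origins

log = [
    "INSERT INTO warehouse.sales SELECT * FROM pos.transactions",
    "INSERT INTO marts.revenue SELECT region, SUM(amount) FROM warehouse.sales GROUP BY region",
]
lineage = build_lineage(log)
print(trace_to_origin("marts.revenue", lineage))  # {'pos.transactions'}
```

The key idea is that the lineage graph is assembled entirely from metadata the systems already emit (their query logs), rather than from documentation or interviews with subject matter experts.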
This highly advanced form of data plumbing is enabling businesses to save millions of pounds by integrating faster, more accurately and more cheaply than ever, irrespective of the technology involved, whether new Hadoop data platforms, legacy mainframes or specialist Teradata appliances.
Meeting regulatory requirements
In banking, for example, the Metadata Driven Estate is allowing major institutions to meet the stringent requirements of regulations such as BCBS 239, where establishing data quality and provenance is essential. This year a leading UK financial services institution used MDE to reduce the total cost of ownership of its massive two-petabyte data estate and deliver BCBS 239 data lineage and data governance, improving competitiveness while achieving regulatory compliance, all in less than six months. Prior to this it had tried for several years but was defeated by the complexity and sheer volume of its estate: more than 500 million queries into more than 4,000 databases from more than 3,600 users. MDE used automation to process this complex estate accurately, quickly and cheaply.
The MDE approach uses automation to fingerprint the data back to its point of origin, enabling the regulator to access data that is wholly and demonstrably accurate and in a compliant format, with the option of drilling down to check accuracy.
Whereas achieving this with conventional methods has been taking many months, or in some cases an entire year, the MDE approach delivers what is required for regulatory compliance reporting as a fixed-fee four-week sprint. It does not rely on you having subject matter experts or accurate documentation on hand; instead it reads the metadata left behind, like footprints in the sand, by all IT systems as they process their day-to-day workload, so MDE gets its information straight from the horse’s mouth.
Separation of banking functions
Strategic re-organisation, such as the separation of investment from retail and commercial banking to meet regulatory requirements, is also placing huge demands on the ability of institutions to determine the provenance of data from thousands of databases, many of which will have been shared, formally or informally, for years. MDE automates the unpicking process and accelerates the re-purposing of data to support the new organisation(s).
The massive degree of duplication within financial institutions, where some datasets are reproduced in different guises as many as 60 times, also adds to the complexity of achieving compliant separation. MDE identifies “safe duplicates” that can be deleted, with queries automatically re-routed to the surviving instance.
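The safe-duplicate detection described above can be sketched, under assumed simplifications, as grouping datasets by a content fingerprint and nominating one survivor per group. The dataset names, the hashing scheme and the survivor rule here are illustrative assumptions, not MDE's actual method:

```python
import hashlib

def fingerprint(rows):
    """Hash a dataset's content so byte-identical copies compare equal."""
    h = hashlib.sha256()
    for row in sorted(rows):                 # sort first: order-insensitive
        h.update(repr(row).encode())
    return h.hexdigest()

def find_safe_duplicates(datasets):
    """Group datasets by fingerprint; return {survivor: [duplicates...]}."""
    by_print = {}
    for name, rows in datasets.items():
        by_print.setdefault(fingerprint(rows), []).append(name)
    return {names[0]: names[1:] for names in by_print.values() if len(names) > 1}

datasets = {
    "crm.customers":      [("001", "Acme"), ("002", "Globex")],
    "mktg.customer_copy": [("002", "Globex"), ("001", "Acme")],  # same data, reordered
    "risk.exposures":     [("001", 1.5)],
}
print(find_safe_duplicates(datasets))
# {'crm.customers': ['mktg.customer_copy']}
```

Once duplicates are grouped this way, the lineage graph tells you which queries read each duplicate, so they can be re-pointed at the survivor before the copies are deleted.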
The metadata-driven approach is again slashing the time and costs involved in these operations, in one case reducing the length of the project from a predicted six months to just two. No less importantly, the process of cleaning up the data and reducing duplication is enabling banks to reduce the size of their IT estates by as much as 30 per cent while at the same time having all the tools that allow regulators to see what they have done and how.
Revenue generation
Use of MDE to generate revenue is also growing – most obviously when it cuts the time it takes to deploy multi-channel marketing campaigns that require highly flexible, quickly configurable, data-hungry applications.
For banks without MDE, the sheer length of time it takes to provision marketing applications can be such a drawback that it often negates all the benefits. Although a marketing application can be fully functional in hours, using traditional methods it can take 18 months to provision it with reliably accurate data that gives the company a single view of the customer. That is hardly surprising when it commonly involves sorting through petabytes of data from 20 or 30 different databases.
However, MDE means provisioning can take place in weeks, rather than months, regardless of the technology involved, slashing the cost and time to make marketing initiatives a reality, allowing the organisation to build its market share with much greater momentum.
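At its simplest, building a "single view of the customer" means merging records keyed on a shared customer identifier across many source feeds. The sketch below is a hypothetical illustration only; the field names, the two feeds and the last-write-wins merge rule are assumptions, and a real provisioning exercise across dozens of databases also has to resolve mismatched keys and conflicting values:

```python
def single_customer_view(feeds):
    """Merge records from several feeds into one view per customer ID.

    Feeds are assumed to be ordered oldest to newest, so later feeds
    overwrite earlier values (a simple last-write-wins rule).
    """
    view = {}
    for feed in feeds:
        for record in feed:
            merged = view.setdefault(record["customer_id"], {})
            # Ignore missing (None) fields so they never erase known values.
            merged.update({k: v for k, v in record.items() if v is not None})
    return view

pos_feed = [{"customer_id": "C1", "name": "A. Smith", "email": None}]
crm_feed = [{"customer_id": "C1", "name": "A. Smith", "email": "a@example.com"}]
view = single_customer_view([pos_feed, crm_feed])
print(view["C1"]["email"])  # a@example.com
```

The point of the metadata-driven approach is that the mapping from each source field to the consolidated view is discovered from the lineage metadata rather than hand-built, which is where the months of traditional provisioning effort go.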
The speed, flexibility and reliability of MDE will transform the ability of any data-driven organisation to meet all its challenges in regulation and innovation.
Surely any technology that improves business performance so dramatically, in so little time and at such reduced cost, is a revolution in its own right?