Wolters Kluwer: Taming data duplication to support business transformation

Will Newcomer

VP - Risk & Performance at Wolters Kluwer

15.03.2018 10:15 am

Thanks to CECL, granular data requirements are here to stay. But how can firms tame the new data requirements by adapting their IT and data infrastructure? Will Newcomer, Vice President of Product & Strategy for Wolters Kluwer’s Finance, Risk & Reporting Americas business, explores the options.

The regulatory requirements that emerged from the firestorm of the Great Recession have put banks under near-relentless pressure to generate and report increasingly detailed layers of data. The Financial Accounting Standards Board’s new Current Expected Credit Loss (CECL) standard, which extends required loss calculations over the lifetime of loans, and signals from Federal Reserve Chairman Jerome Powell that the stress testing and resolution planning components of Dodd-Frank will be maintained, show that granular data requirements are here to stay. While the current administration’s focus on deregulation might bring some relief, it is unlikely to turn the tide, let alone relieve the data burdens bankers already face.

The most challenging data management burden is rooted in duplication. The evolution of regulations has left banks with various bespoke databases across five core functions -- credit, treasury, profitability analytics, financial reporting and regulatory reporting -- with the same data inevitably appearing, and being processed, in multiple places.

This hodgepodge of bespoke marts leads to duplication of both data and processes, and to the risk of inconsistencies -- which tend to rear their heads at inopportune moments (that is, when consistent data needs to be presented to regulators). For example, credit extracts core loan, customer and credit data; treasury pulls core cash flow data from all instruments; profitability departments pull the same instrument data as credit and treasury and add ledger information for allocations; financial reporting pulls ledgers and some subledgers for GAAP reporting; and regulatory reporting pulls the same data yet again to submit reports to regulators per prescribed templates.
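To make the duplication concrete, here is a minimal sketch in Python; the schema, table and departmental extracts are hypothetical stand-ins for illustration, not any particular bank’s systems:

```python
# A minimal sketch of departmental duplication (hypothetical schema).
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the core banking source
conn.execute("CREATE TABLE loans (loan_id, customer_id, balance, rating)")
conn.execute("INSERT INTO loans VALUES (1, 'C001', 250000.0, 'BBB')")

# Each function maintains its own bespoke extract of the same instrument data.
extracts = {
    "credit":               "SELECT loan_id, customer_id, balance, rating FROM loans",
    "treasury":             "SELECT loan_id, balance FROM loans",
    "profitability":        "SELECT loan_id, balance, rating FROM loans",
    "financial_reporting":  "SELECT loan_id, balance FROM loans",
    "regulatory_reporting": "SELECT loan_id, customer_id, balance FROM loans",
}

for dept, sql in extracts.items():
    rows = conn.execute(sql).fetchall()
    # In practice each result set lands in a separate mart and is transformed
    # there again -- five copies of the data, five sets of rules to reconcile.
    print(f"{dept}: extracted {len(rows)} row(s) into its own mart")
```

Five queries, five marts, one underlying dataset: any rule that changes in one place but not the other four is a future inconsistency.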

Relearning the data alphabet

Data storage has evolved from simple flat files to databases to marts to warehouses, and now to ‘lakes’ housed in ‘big data’ environments, where all kinds of structured and unstructured data are tended by data scientists. Proprietary calculation and reporting solutions designed for different requirements complicate the picture even further.

Just as the complexity of housing data has evolved, so have the data management tools. Data management is typically thought of in three stages: Extract, Transform and Load (ETL). But considering the multiple levels of staging tables between data sources and storage areas, most real-world data management processes consist of many more than three sequential steps (e.g. ELETL or ELELTL) -- and this is just to get the detailed data. Additional steps -- Calculation (C), Aggregation (A) and Presentation (P) -- are needed throughout to meet today’s analytical and reporting requirements.
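As a rough illustration of why pipelines read like ELETL plus C, A and P, here is a minimal sketch; the stage functions, data shapes and the 2% loss rate are assumptions made for the example, not anything prescribed by CECL:

```python
# A minimal ELETL + C/A/P sketch; all values are illustrative.

def extract(source):
    # E: pull raw records from a source system (hard-coded for the sketch)
    return [{"loan_id": 1, "balance": "250000"}]

def load(records, target):
    # L: persist records to a staging area or mart (a list stands in here)
    target.extend(records)

def transform(records):
    # T: normalize types and apply business rules
    return [{**r, "balance": float(r["balance"])} for r in records]

def calculate(records):
    # C: derive measures, e.g. a lifetime expected-loss figure (2% is illustrative)
    return [{**r, "ecl": r["balance"] * 0.02} for r in records]

def aggregate(records):
    # A: roll instrument-level detail up for reporting
    return {"total_ecl": sum(r["ecl"] for r in records)}

def present(summary):
    # P: format the results for a report template
    return f"Total expected credit loss: {summary['total_ecl']:,.2f}"

staging, mart = [], []
load(extract("core_banking"), staging)       # E, L: land raw data in staging
load(transform(staging), mart)               # E, T, L: re-read, normalize, reload
print(present(aggregate(calculate(mart))))   # C, A, P
```

Even in this toy version the detailed data is extracted and loaded twice before a single calculation runs, and each extra staging hop in a real pipeline adds another E and L.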

When considered end to end, multiple occurrences of E, T, L, C, A and P are embedded in today’s data management processes -- many of which are manual or semi-manual and still performed by senior management.


Banks should seize the moment

Fortunately for forward-thinking banks, the window before CECL kicks in provides a golden opportunity to transform the disparate data marts and processes underpinning key departments into a more integrated, future-proof approach -- one that benefits not only compliance but also profitability and competitiveness.

The fact that CECL will effectively force this integration makes it even more important to begin the transformation now. The question is, how?

Some advocate ELT as the solution, but this exacerbates the duplication of rules and loses the value of clean, normalized, persistent data. Others advocate throwing everything into a new data lake and letting the data scientists fish out what’s needed. Still others, sadly, continue to build or buy point systems with their own separate databases and management processes, adding yet another island to the chain. In reality there is no ‘one size fits all’ approach, no single data ocean or program. Tactical solutions to problems that are both immediate and strategic are not transformational; neither are old processes under new names.

The path to effectively transforming data management is to combine tried-and-true processes and solutions with the selective deployment of new technologies that remove undesirable duplication of both rules and storage.

A map of such an approach is below: ETL/ELT is applied to source data, which is consolidated into staging tables or a data lake. Required and/or relevant data from the staging tables or lake is then transferred to a permanent, defined data mart, where analysis, calculation and presentation can be conducted before the results are transmitted to various business functions and external recipients.

Diagram: A blueprint for the transformation of data management
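As a hedged sketch of that blueprint -- the source systems, field names and the single aggregate below are illustrative assumptions -- the shape of the flow might look like this:

```python
# A minimal sketch of the blueprint: one staging landing, one shared mart,
# calculation and presentation run once for all consumers. Illustrative only.

def land_in_staging(sources):
    # ETL/ELT: land data from every source system in staging tables or a lake.
    return [record for source in sources for record in source]

def build_mart(staging):
    # Transfer required, normalized data into one permanent, defined data mart.
    return [{**r, "balance": float(r["balance"])} for r in staging]

def calculate_and_aggregate(mart):
    # Analysis, calculation and aggregation run once against the shared mart.
    return {"total_balance": sum(r["balance"] for r in mart)}

def distribute(results, consumers):
    # Presentation: every business function and external recipient receives
    # the same consistent numbers instead of re-deriving their own.
    for consumer in consumers:
        print(f"{consumer}: total balance = {results['total_balance']:,.2f}")

sources = [
    [{"loan_id": 1, "balance": "250000"}],   # e.g. a loan system
    [{"loan_id": 2, "balance": "125000"}],   # e.g. a mortgage system
]
mart = build_mart(land_in_staging(sources))
distribute(calculate_and_aggregate(mart),
           ["credit", "treasury", "profitability",
            "financial reporting", "regulatory reporting"])
```

The design point is where the C, A and P steps sit: once, against the shared mart, rather than once per department.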

Reducing the need for departments to pursue these processes independently creates a more unified data management structure, minimizing duplication and redundancy and improving efficiency.

The result is that banks have more time to focus on core business goals. Better-managed data helps banks manage risk better and build a clearer picture of customer behaviors. Data management, risk and finance professionals need to overhaul or replace their bespoke tools and processes to support this shift. The institutions that use this brief period of regulatory ‘downtime’ to improve outdated and inefficient processes will be those in the best position going forward.

Note: Will Newcomer has more than 35 years of experience in risk and finance with major and regional banks as well as leading technology firms, making him uniquely qualified to lead clients to the forefront of integrated finance, risk and compliance solutions. In addition, Newcomer uses extensive experience in enterprise-wide management information systems to help financial institutions in the areas of risk-adjusted performance management, budgeting and planning, asset and liability management, incentive compensation, financial reporting and stress testing. He will be speaking at next week’s CECL 2018 Congress in New York, organized by the Center for Financial Professionals and sponsored by Wolters Kluwer.
