Wolters Kluwer: Taming data duplication to support business transformation

Will Newcomer

VP - Risk & Performance at Wolters Kluwer



15.03.2018 10:15 am

Thanks to CECL, granular data requirements are here to stay. But how can firms tame the new data requirements by adapting their IT and data infrastructure? Will Newcomer, Vice President of Product & Strategy for Wolters Kluwer’s Finance, Risk & Reporting Americas business, explores the options.

The regulatory requirements that emerged from the financial firestorm of the Great Recession have put banks under near-relentless pressure to generate and report increasingly detailed layers of data. The Financial Accounting Standards Board’s new Current Expected Credit Loss (CECL) standard, which extends required loss calculations over the lifetime of loans, and signals from Federal Reserve Chairman Jerome Powell that the stress testing and resolution planning components of Dodd-Frank will be maintained, show that granular data requirements are here to stay. While the current administration’s focus on deregulation might bring some relief, it is unlikely to turn the tide, let alone lift the data burdens bankers already face.

The most challenging data management burden is rooted in duplication. The evolution of regulations has left banks with a patchwork of bespoke databases across five core functions -- credit, treasury, profitability analytics, financial reporting and regulatory reporting -- with the same data inevitably stored and processed in multiple places.

This hodgepodge of bespoke marts leads simultaneously to duplication of data and processes and to the risk of inconsistencies -- which tend to rear their heads at inopportune moments (i.e., when consistent data needs to be presented to regulators). For example, credit extracts core loan, customer and credit data; treasury pulls core cash flow data from all instruments; profitability departments pull the same instrument data as credit and treasury and add ledger information for allocations; financial reporting pulls ledgers and some subledgers for GAAP reporting; and regulatory reporting pulls the same data yet again to submit reports to regulators per prescribed templates.
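To make the pattern concrete, here is a minimal Python sketch of that duplication; the table, column and function names are hypothetical, invented for illustration rather than drawn from any bank's actual systems. Each department re-extracts the same core loan data under its own bespoke rules.

import sqlite3

# Illustrative source system: one loans table, one ledger table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (loan_id, customer_id, balance, rate, rating)")
conn.execute("CREATE TABLE ledger (loan_id, cost_center)")
conn.execute("INSERT INTO loans VALUES (1, 'C-17', 100.0, 0.05, 'BB')")
conn.execute("INSERT INTO ledger VALUES (1, 'CC-9')")

def extract_for_credit(conn):
    # Credit's bespoke pull of loan, customer and credit data.
    return conn.execute(
        "SELECT loan_id, customer_id, balance, rating FROM loans").fetchall()

def extract_for_treasury(conn):
    # Treasury re-reads the same instruments for cash-flow inputs.
    return conn.execute(
        "SELECT loan_id, balance, rate FROM loans").fetchall()

def extract_for_profitability(conn):
    # Profitability pulls the same instrument data yet again and adds
    # ledger information for allocations.
    return conn.execute(
        "SELECT l.loan_id, l.balance, g.cost_center "
        "FROM loans l JOIN ledger g ON g.loan_id = l.loan_id").fetchall()

# Three bespoke extracts of the same balances: every schema or rule change
# must now be repeated in three places, and the copies can drift apart.
for fn in (extract_for_credit, extract_for_treasury, extract_for_profitability):
    print(fn.__name__, fn(conn))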

Relearning the data alphabet

Data storage has evolved from simple flat files to databases, to marts, to warehouses, and now to ‘lakes’ housed in ‘big data’ environments, where all kinds of structured and unstructured data are tended by data scientists. Proprietary calculation and reporting solutions designed for different requirements complicate the picture even further.

Just as the complexity of housing data has evolved, so have the data management tools. Data management is typically thought of in three stages: Extract, Transform and Load (ETL). Considering the multiple levels of staging tables between data sources and storage areas, most real-world data management processes consist of much more than three sequential steps (e.g. ELETL or ELELTL) -- and this is just to get the detailed data. Additional steps -- Calculation (C), Aggregation (A) and Presentation (P) -- are needed throughout to meet today’s analytical and reporting requirements.
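As a rough illustration of those stages, the sketch below labels one toy Python function per E, T, L, C, A and P step and chains them into an ‘ELETL’-style run. The records, field names and the simplistic expected-loss formula are all assumptions made for the example, not a real pipeline.

def extract(source):            # E: pull raw records from a source system
    return list(source)

def load(records, store):       # L: persist into a staging table / area
    store.extend(records)
    return store

def transform(records):         # T: normalize and clean the staged data
    return [{**r, "balance": float(r["balance"])} for r in records]

def calculate(records):         # C: e.g. a lifetime expected-loss figure
    return [{**r, "loss": r["balance"] * r["pd"]} for r in records]

def aggregate(records, key):    # A: roll detail up for reporting
    totals = {}
    for r in records:
        totals[r[key]] = totals.get(r[key], 0.0) + r["loss"]
    return totals

def present(totals):            # P: shape the result for a report template
    return "\n".join(f"{k}: {v:,.2f}" for k, v in sorted(totals.items()))

# An "ELETL"-style run: extract, land in staging, re-extract, transform,
# load into a mart -- and only then calculate, aggregate and present.
source = [{"portfolio": "CRE", "balance": "100", "pd": 0.02}]
staging = load(extract(source), [])
mart = load(transform(extract(staging)), [])
print(present(aggregate(calculate(mart), "portfolio")))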

When considered end to end, multiple occurrences of E, T, L, C, A and P are embedded in today’s data management processes -- many of which are manual or semi-manual and still performed by senior management.


Banks should seize the moment

Fortunately for forward-thinking banks, the window before CECL kicks in provides a golden opportunity to transform the disparate data marts and processes underpinning key departments into a more integrated, future-proof approach benefiting not only compliance but also profitability and competitiveness.

The fact that CECL will effectively force this integration makes it even more important to begin the transformation now. The question is, how?

Some advocate ELT as the solution, but this exacerbates the duplication of rules and loses the value of clean, normalized, persistent data. Others advocate throwing everything into a new data lake and letting the data scientists fish out what’s needed. Still others, unfortunately, continue to build or buy point systems with separate databases and management processes, adding yet another island to the chain. In reality there is no ‘one size fits all’ approach, no single data ocean or program. Tactical solutions to problems that are both immediate and strategic are not transformational; neither are old processes under new names.

The path to effectively transforming data management is to combine tried-and-true processes and solutions with the selective deployment of new technologies that remove undesirable duplication of both rules and storage.

A map of such an approach is below: ETL/ELT is applied to source data, which is consolidated into staging tables or a data lake. Required and/or relevant data from the staging tables or lake is then transferred to a permanent, well-defined data mart, where analysis, calculation and presentation can be conducted before the results are transmitted to various business functions and external recipients.

Diagram: A blueprint for the transformation of data management
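In code form, the flow the diagram describes might be sketched as follows: stage once, land once in a governed mart, and let every business function read its own view from that single copy. This is a toy, single-process Python stand-in for real databases and ETL tooling, with all names and record layouts assumed for illustration.

def stage(sources):
    """ETL/ELT: consolidate raw source data into one staging area / lake."""
    staged = []
    for source in sources:
        staged.extend(source)       # in practice: bulk loads, not a loop
    return staged

def build_mart(staged):
    """Move required, normalized data into the permanent data mart."""
    return {row["loan_id"]: row for row in staged}   # one record per loan

def serve(mart, columns):
    """Each function requests the fields it needs from the shared mart."""
    return [{c: row[c] for c in columns} for row in mart.values()]

sources = [
    [{"loan_id": 1, "balance": 100.0, "rate": 0.05, "rating": "BB"}],
    [{"loan_id": 2, "balance": 250.0, "rate": 0.04, "rating": "A"}],
]
mart = build_mart(stage(sources))

# One extraction, one storage point, many views: the duplication of rules
# and of physical copies described earlier is removed.
print(serve(mart, ["loan_id", "balance", "rating"]))  # credit's view
print(serve(mart, ["loan_id", "balance", "rate"]))    # treasury's view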

Reducing the need for departments to pursue these processes independently creates a more unified data management structure, minimizing duplication and redundancy and improving efficiency.

The result is that banks have more time to focus on core business goals. Better-managed data helps banks manage risk and build a clearer picture of customer behaviors. Data management, risk and finance professionals need to overhaul or replace their bespoke tools and processes to support this transformation. The institutions that use this brief period of regulatory ‘downtime’ to improve outdated and inefficient processes will be those in the best position going forward.

Note: Will Newcomer has more than 35 years of experience in risk and finance with major and regional banks as well as leading technology firms, making him uniquely qualified to lead clients to the forefront of integrated finance, risk and compliance solutions. In addition, Newcomer draws on extensive experience in enterprise-wide management information systems to help financial institutions in the areas of risk-adjusted performance management, budgeting and planning, asset and liability management, incentive compensation, financial reporting and stress testing. He will be speaking at next week’s CECL 2018 Congress in New York, organized by The Center for Financial Professionals and sponsored by Wolters Kluwer.
