Faster Access to Clean Data is the New Basis for Competition

  • Neil Sandle, Head of Product Management at Alveo

  • 11.03.2022 01:30 pm
  • #data #basis

Today, financial services firms are experiencing rapid growth in data volumes and data diversity. More content is available to feed into decision-making. At the same time, regulators are requesting more information and disclosures, and customers and investors require more granular information, faster. Many firms in this sector, for example, have ongoing pre-trade or post-trade transparency requirements to fulfil.

Digitalisation is also generating huge amounts of data as a by-product of business activities, often referred to as ‘digital exhaust’. Through innovative techniques such as natural language processing (NLP), this data can be used to gauge market sentiment and provide additional colour on financial markets. In addition, data is being used for a range of purposes, from regulatory compliance to deeper insight into potential investments.
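As an illustration of the NLP point, the sketch below scores the sentiment of a few hypothetical news headlines using NLTK's general-purpose VADER analyser. A production setup would more likely use a finance-tuned model, but the shape of the workflow is the same: text in, sentiment scores out.

```python
# Minimal sketch: gauging market sentiment from news headlines with NLTK's VADER.
# The headlines below are hypothetical stand-ins for 'digital exhaust' content.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download

headlines = [
    "Acme Corp beats earnings expectations, raises full-year guidance",
    "Regulator opens probe into Acme Corp accounting practices",
]

analyser = SentimentIntensityAnalyzer()
for text in headlines:
    scores = analyser.polarity_scores(text)  # neg/neu/pos plus a compound score in [-1, 1]
    print(f"{scores['compound']:+.2f}  {text}")
```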

The availability of this data and the potential it provides, coupled with increasingly data-intensive jobs and reporting requirements, means financial services firms need to improve their market data onboarding, operationalisation, user access and analytics capabilities.

Making good use of data is complex. Firms need to develop a list of companies or financial products they want to gather data on and decide what information to collect. Once the data is sourced, they need to know what data sets are available and show business users the sources, when the data was requested, what came back, and what quality checks were applied.

They also need to know the contextual information: for example, was the data disclosed directly, is it expert opinion or simply opinion expressed online, and who has permission to use it? This makes it easier to decide which data to use. Furthermore, there are certain key processes data needs to go through before it can be fully tested. If the data is for operational purposes, firms need a data set that is high quality and delivered reliably by a provider they can trust. Because the data will feed an automated, recurring day-to-day process, firms need predictability around its availability and quality.
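To illustrate the kind of quality checks referred to above, the following sketch runs a few basic completeness and sanity checks over a hypothetical price snapshot before it enters a recurring, automated process. The field names and thresholds are purely illustrative.

```python
# Minimal sketch of basic quality checks a firm might record against an
# incoming market data snapshot. Field names and the staleness threshold
# are hypothetical.
from datetime import datetime, timedelta, timezone

def run_quality_checks(records: list[dict]) -> dict:
    now = datetime.now(timezone.utc)
    results = {
        "rows_received": len(records),
        "missing_price": sum(1 for r in records if r.get("price") is None),
        "negative_price": sum(1 for r in records if (r.get("price") or 0) < 0),
        "stale": sum(1 for r in records if now - r["as_of"] > timedelta(days=1)),
    }
    results["passed"] = results["missing_price"] == 0 and results["negative_price"] == 0
    return results

snapshot = [
    {"instrument_id": "XS0123456789", "price": 101.25, "as_of": datetime.now(timezone.utc)},
    {"instrument_id": "XS0987654321", "price": None, "as_of": datetime.now(timezone.utc)},
]
print(run_quality_checks(snapshot))
```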

If the data is for market research, the user may only want to use each data set once, but they are likely to be more adventurous in finding new sets that give them a market edge. However, the quality of the data and the ability to trust it are still crucial.

Existing approaches can often fall short

There are a range of drawbacks to existing approaches to market data management and analytics. IT is typically used to automate processes quickly, but this can lead to fossilised processes that are no longer fit for purpose. Financial and market analysts can end up hardwired to specific data sets and formats when more flexibility is needed to onboard new data sources quickly.

Existing approaches often make it difficult to bring in new data sets because new data comes in different formats. Onboarding and operationalising data can be costly and time-consuming and, when users want to bring in a new source or connect a new application or financial model, it can also be error-prone.

Historically, market data collection, preparation and analytics have been different disciplines, separately managed and executed. Data has to be copied into another database before an analyst can run a risk or investment model against it. It is time for a new approach that materially shortens the turnaround time for bringing in new content and provisioning that content to business users.

Moving forward

The latest big data management tools can help a great deal in this context. They typically use cloud-native technology, so they are easily scalable depending on the intensity or volume of the data. Using cloud-based platforms can also give firms a more flexible way of paying for the resources they use. 

These tools can also facilitate the integration of data management and analytics, something that has proved difficult with legacy approaches. The use of underlying technologies like Cassandra and Spark makes it much easier to bring business logic or financial models to the data, streamlining the whole process and driving operational efficiencies.
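To make the idea of bringing the model to the data concrete, the sketch below is illustrative rather than a description of any particular vendor's product: it reads prices from a Cassandra table through Spark and computes a simple volatility measure in place, instead of copying the data into a separate analytics database. It assumes the spark-cassandra-connector is available on the Spark classpath; the connection host, keyspace, table and column names are hypothetical.

```python
# Minimal sketch: running business logic next to market data held in Cassandra via Spark.
# Assumes the spark-cassandra-connector is on the classpath; names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window

spark = (
    SparkSession.builder
    .appName("price-analytics")
    .config("spark.cassandra.connection.host", "cassandra-host")
    .getOrCreate()
)

prices = (
    spark.read.format("org.apache.spark.sql.cassandra")
    .options(keyspace="market_data", table="daily_prices")
    .load()
)

# Daily returns per instrument, then the standard deviation of those returns
# as a simple volatility measure, all computed where the data lives.
w = Window.partitionBy("instrument_id").orderBy("trade_date")
returns = prices.withColumn("return", F.col("close") / F.lag("close").over(w) - 1)
volatility = returns.groupBy("instrument_id").agg(F.stddev("return").alias("volatility"))
volatility.show()
```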

In-memory data grids can be used to deliver fast response times to queries, and data integration and distribution technology can streamline the onboarding of new consuming business applications. Faster last-mile integration supports faster, better-informed decision-making and lowers the cost of change when addressing new reporting requirements.
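As a simple illustration of the caching idea behind in-memory data grids, the sketch below uses Redis as a stand-in and follows a cache-aside pattern so consuming applications get in-memory reads for hot data. The host, key scheme and the fetch_from_master_store helper are hypothetical.

```python
# Minimal sketch: cache-aside lookups with Redis standing in for an in-memory data grid.
# Consuming applications read from memory when possible and fall back to the
# golden-copy store otherwise. Host, key naming and the fetch helper are hypothetical.
import json
import redis

cache = redis.Redis(host="localhost", port=6379)

def get_instrument(instrument_id: str) -> dict:
    key = f"instrument:{instrument_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)                    # fast path: served from memory
    record = fetch_from_master_store(instrument_id)  # hypothetical slow path to the master store
    cache.set(key, json.dumps(record), ex=300)       # keep hot for five minutes
    return record
```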

Establishing a Competitive Edge

Financial services firms should be looking to maximise their data return on investment (ROI). The ‘know your data’ message is important here, as firms need to know what they have, understand its lineage and track its distribution.

They must also ensure their stakeholders know what data is available and how easily they can access it. Ultimately, this is what will drive a firm’s competitive edge, and the latest data management tools can make this a smoother process.
