Fast, flexible and predictive – how market surveillance must change

  • Nigel Farmer, Solutions Director, Capital Markets at Software AG

  • 12.10.2015 02:16 pm

Rate-fixing scandals, rogue traders and regulatory breaches – these are all potentially devastating threats to the reputation and bottom line of a bank or trading institution.

Each on its own is a good enough reason for any organisation operating in the markets to ensure that risk management is as taut and well-tuned as possible. The alternative is to court disaster through exposure to massive losses, drawn-out investigations and crippling fines.

To avoid these gloomy outcomes, banks have for several years been investing in surveillance technology in the hope of detecting risks so they can intervene immediately to prevent any damage.

Yet concentrating purely on the huge volumes of trading data – and achieving the major feat of processing it – is, sad to say, insufficient.

To achieve total surveillance in today’s markets, any institution must also be able to take trading information from all asset classes and cross-correlate it with staff emails, phone calls, social media, patterns of work, human resources records and even how many times traders leave the building for a smoke.

Anyone doubting the importance of analysing data from social media and other “unstructured” sources should remember that in both the Libor and FX-fixing scandals (which each cost banks £6 billion in fines), traders used online chat rooms to collude.

Data from emails and voice calls can, for instance, indicate that oblique manipulation of prices or insider trading is taking place, as when a trader uses unauthorised information on a decision affecting oil prices to trade in the equities of major companies that rely heavily on fuel, such as airlines.
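
To make the idea concrete, here is a minimal sketch (in Python, purely for illustration) of the kind of cross-correlation described above: hypothetical communication and trade records are joined by trader and time, and trades that closely follow a communication containing watch-list terms are flagged for review. The record fields, the terms and the six-hour window are assumptions, not a description of any particular platform.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical, simplified records; a real platform would ingest these from
# email archives, voice transcription and trade-capture systems.
@dataclass
class Communication:
    trader: str
    timestamp: datetime
    text: str

@dataclass
class Trade:
    trader: str
    timestamp: datetime
    instrument: str
    notional: float

# Illustrative watch-list of phrases that might indicate unauthorised information.
SENSITIVE_TERMS = {"opec decision", "embargo", "not public yet"}

def flag_suspicious_trades(comms, trades, window=timedelta(hours=6)):
    """Return (communication, trade) pairs where a trade by the same person
    follows a sensitive communication within the time window -- a crude proxy
    for the cross-correlation described above."""
    alerts = []
    for comm in comms:
        if not any(term in comm.text.lower() for term in SENSITIVE_TERMS):
            continue
        for trade in trades:
            if (trade.trader == comm.trader
                    and comm.timestamp <= trade.timestamp <= comm.timestamp + window):
                alerts.append((comm, trade))
    return alerts
```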

Equally, derivatives traders may collude to manipulate prices in underlying assets or currencies. In the Libor scandal, for example, Barclays was fined £290 million after its derivatives traders were found to have attempted to influence the rate to their advantage.

Surveillance of communications and social media will also, if correlated with market data, reveal whether all-important Chinese walls have been breached within an organisation.

And strange as it may seem, human resources records are also useful in surveillance, because many rogue traders, such as Jerome Kerviel, who lost Societe Generale €4 billion, are reluctant to go on holiday in case their activity is uncovered.

Even door sensor data can be pooled to show whether a trader is taking a lot of breaks. Experience has shown investigators there is a strong correlation between smoking breaks and unauthorised trading – on the basis that this is when insider information is shared.

The same surveillance technology that analyses these types of data will also monitor the algorithms that govern automated trading. Every bank wants to avoid a disaster such as the infamous Knight Capital loss of $440 million in 30 minutes, when its market-making software malfunctioned.

Having recognised that they must monitor all these data sources, trading institutions must give themselves the ability to predict dangers and to intervene to head them off.

This can only be achieved by adopting the advances in analytics and in-memory technology that are now available.

In a well-designed platform, these technologies build a picture of what is “normal” within each organisation and then compare it with current activity to reveal risks in real time.

By establishing patterns, the data analysis will expose the activities of rogue individuals who will stand out if they deviate from what is normal in their peer group. This might manifest itself in trading in a different type of asset or the building of an unusually large position – the kind of activity that many rogue traders are adept at hiding.
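
A minimal sketch of that peer-group comparison, assuming nothing more than daily gross position sizes per trader: each trader’s average is scored against the desk’s norm and large deviations are flagged. A real system would profile many more features (asset mix, timing, counterparties), and all names and thresholds below are illustrative assumptions.

```python
from statistics import mean, stdev

def peer_deviation_alerts(daily_positions, threshold=3.0):
    """daily_positions: dict mapping trader -> list of daily gross position sizes
    for one peer group (e.g. a desk). Flags traders whose average position
    deviates from the peer-group norm by more than `threshold` standard deviations."""
    averages = {trader: mean(sizes) for trader, sizes in daily_positions.items()}
    peer_mean = mean(averages.values())
    peer_sd = stdev(averages.values())
    alerts = {}
    for trader, avg in averages.items():
        z = (avg - peer_mean) / peer_sd if peer_sd else 0.0
        if abs(z) > threshold:
            alerts[trader] = z
    return alerts

# Example: one desk's hypothetical average daily positions (in millions).
# With such a small peer group, a lower threshold is used for illustration.
desk = {
    "trader_a": [10, 12, 11, 9],
    "trader_b": [11, 10, 12, 10],
    "trader_c": [95, 110, 120, 130],   # builds an unusually large position
    "trader_d": [9, 11, 10, 12],
}
print(peer_deviation_alerts(desk, threshold=1.0))  # flags trader_c only
```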

In the same way, algorithms that deviate without cause and “go wild” can be more quickly detected, as can the external market risks that might knock them off course.

Achieving this in real time, so that an institution receives alerts and has processes in place to intervene and prevent harm, requires a sophisticated single platform that uses in-memory technology to cache the data and access it quickly for correlation.

It needs to be capable of processing big data very fast – handling as many as 20 million messages per second – so the bank is alerted to risks immediately.
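
Throughput of that order calls for a purpose-built event-processing engine rather than anything like the toy below; the sketch only illustrates the underlying pattern of holding recent events in an in-memory window and evaluating a correlation rule as each message arrives. The rule, field names and five-minute window are assumptions for illustration.

```python
from collections import defaultdict, deque

class InMemorySurveillanceWindow:
    """Keeps the last `window_seconds` of events per trader in memory and
    evaluates a simple rule on arrival -- the caching-and-correlating pattern
    described above, with no claim to the quoted throughput."""

    def __init__(self, window_seconds=300):
        self.window_seconds = window_seconds
        # trader -> deque of (timestamp_in_epoch_seconds, event_type, payload)
        self.events = defaultdict(deque)

    def ingest(self, trader, timestamp, event_type, payload):
        window = self.events[trader]
        window.append((timestamp, event_type, payload))
        # Evict events that have fallen out of the time window.
        while window and timestamp - window[0][0] > self.window_seconds:
            window.popleft()
        return self._evaluate(trader)

    def _evaluate(self, trader):
        """Illustrative rule: an order in an instrument the same trader has
        mentioned in an earlier chat within the window is flagged for review."""
        window = self.events[trader]
        earliest_chat = {}
        for t, e, p in window:
            if e == "chat":
                earliest_chat.setdefault(p["instrument"], t)
        alerts = []
        for t, e, p in window:
            if (e == "order" and p["instrument"] in earliest_chat
                    and t >= earliest_chat[p["instrument"]]):
                alerts.append((trader, t, p["instrument"]))
        return alerts

# Example (hypothetical timestamps in epoch seconds):
# w = InMemorySurveillanceWindow(window_seconds=300)
# w.ingest("trader_a", 1000.0, "chat", {"instrument": "XYZ"})
# print(w.ingest("trader_a", 1120.0, "order", {"instrument": "XYZ", "qty": 5_000_000}))
```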

Not only must it be rapid, it must be customisable, because every organisation in the markets has its own unique characteristics and trading patterns, which may change radically in line with a new business strategy. Each area of operations will also be subject to different regulations, often in different countries, requiring a constant process of updating in order to achieve compliance and optimal efficiency.

An off-the-shelf solution is very unlikely to achieve the necessary level of either performance or adaptability. Relying on six-monthly or yearly releases of software is a poor substitute for a platform that can be immediately adapted in-house with the minimum of disruption.

The adoption of this holistic and predictive approach to market surveillance will move a bank forward from mere compliance to a position where it can act on risks before they have any substantial impact.

The technology’s adaptability to multiple asset classes and its effectiveness in detecting fraud, money laundering and pre-trade risk make it a cost-effective solution, enhanced by its accuracy in avoiding time-wasting false alarms.

In a market environment of increased regulatory scrutiny and huge fines, it is surely a technology that major institutions must embrace if they want to thrive, rather than merely survive.
