Expertise | Trust | Leadership

Market Surveillance – Lessons from Failures

May 2024

Dear clients and friends,

In a recent Market Watch newsletter (Market Watch 79), the FCA returned to the topic of market abuse surveillance systems and their effective use.

What were we reminded of?

In brief, a firm must have effective arrangements to detect and report suspicious activity. It is up to each firm to determine what is effective and proportionate for their business. The monitoring and surveillance systems that firms use must be subject to the necessary governance, testing, reviews, and rectification of issues – an assumption that the software is working perfectly is not sufficient.

What are the FCA’s key observations?

Failures of the alerts produced by surveillance systems, caused by faulty implementation, bugs creeping into the system, or data being ingested improperly or not at all, have resulted in:

  • entire sections of a firm’s activity not being monitored;
  • alerts being generated, but not in all cases where they would be expected; and
  • no alerts being generated at all because of poor initial testing at and after implementation.

Examples that the FCA point to include:

  • a testing failure meant that a firm did not identify that the news feed required for its surveillance system to generate alerts had never been activated. It took the firm three years to realise, and only then because of an FCA STOR enquiry; and
  • due to a mistake at the coding stage, another surveillance system failed to generate certain alerts because it required a trade in the relevant instrument on the day that the price moved; in illiquid instruments this created a risk of potential insider dealing going undetected. The error went undetected for several years, hidden because the system continued to generate alerts in reasonable numbers.
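The second failure can be illustrated with a minimal, hypothetical sketch (the function names, dates, and five-day lookback below are our own illustrative assumptions, not details of the vendor system the FCA describes): alert logic that requires a trade on the exact day of the price move will miss activity in illiquid instruments, where the suspicious trade may precede the move by days.

```python
from datetime import date, timedelta

# Hypothetical illustration of the FCA's second example: a rule that
# only fires when a trade occurred on the day of the price move misses
# illiquid instruments, where the trade may precede the move by days.

def flawed_alert(price_move_day, trade_days):
    """Fires only if a trade happened on the day of the price move."""
    return price_move_day in trade_days

def windowed_alert(price_move_day, trade_days, lookback_days=5):
    """Fires if a trade happened within a lookback window ending on the move day."""
    window_start = price_move_day - timedelta(days=lookback_days)
    return any(window_start <= d <= price_move_day for d in trade_days)

# An illiquid instrument: the only trade was three days before the move.
move = date(2024, 3, 14)
trades = [date(2024, 3, 11)]

print(flawed_alert(move, trades))    # False - potential insider dealing missed
print(windowed_alert(move, trades))  # True  - picked up by the windowed rule
```

Note that the flawed version still fires whenever trade and price move coincide, which is why, as the FCA observed, overall alert volumes can look healthy while a whole class of activity goes unmonitored.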

Most firms reviewed by the FCA did, however, have formal procedures in place describing the frequency of testing and which elements of the models were subject to review. We would advise implementing similar procedures where they are not already in place. Furthermore, most firms undertook an annual test of some type, covering parameter calibration, model logic, model code and data (comprehensiveness and accuracy).

In Conclusion…

The effective use of surveillance really comes down to three principles for the FCA:

  1. Data governance – is all relevant trade and order data being captured? Is ownership of data clearly defined? Are measures in place to regularly conduct checks and identify issues?
  2. Model Testing and Validation – are governance arrangements around model testing sufficiently robust? How frequently is testing occurring? Does the testing include back testing historical data and/or conducting scenario-based testing to ensure that the system accurately detects potential market abuse?
  3. Model implementation and amendment – what form of testing is undertaken before introducing new surveillance models or amendments to models? How quickly is action taken to implement, modify, recalibrate, and fix surveillance models? Is regression testing undertaken when changes are made to other systems that might adversely affect surveillance systems?
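The model testing principle can be made concrete with a simple scenario-based test: inject a synthetic abuse scenario into the surveillance logic and assert that an alert is raised, alongside a benign series that must not alert. The toy volume-spike model below is a hypothetical stand-in for a firm's own surveillance model, sketched only to show the shape of such a test.

```python
# A minimal sketch of scenario-based model testing, assuming a
# surveillance model exposed as a function that returns alert indices.
# The model and scenarios are hypothetical stand-ins, not a real system.

def volume_spike_model(daily_volumes, threshold=3.0):
    """Toy model: flag any day whose volume exceeds `threshold` times
    the average of all preceding days."""
    alerts = []
    for i in range(1, len(daily_volumes)):
        baseline = sum(daily_volumes[:i]) / i
        if baseline and daily_volumes[i] > threshold * baseline:
            alerts.append(i)
    return alerts

# Scenario test: a synthetic spike injected on day 4 must produce an alert.
scenario = [100, 110, 90, 100, 600]
assert volume_spike_model(scenario) == [4], "model failed to flag the scenario"

# Benign series: no alerts expected; a failure here signals over-alerting.
benign = [100, 110, 90, 100, 105]
assert volume_spike_model(benign) == [], "model produced false alerts"
print("scenario tests passed")
```

Running such tests on a schedule, and again after any change to the model or its upstream data feeds, is one practical way to meet the back testing, scenario testing, and regression testing expectations listed above.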

What now?

The FCA note that the governance of surveillance systems has been lacking in some cases and expect the firms they regulate to proactively guard against surveillance failures. Do not think that the implementation of the system and alert generation means that the job is done; it is just the start.

So, reviewing your surveillance systems under the three headings above would be time well spent to mitigate the risks the FCA have identified in their work, especially where there has been no testing since implementation, or since a change of ownership or data feeds.

Please contact your Judd consultants should you wish to discuss further.


