
Financial Regulator's Algorithm Compliance Concerns Are Relevant to All Businesses

The UK's financial regulator, the Financial Conduct Authority (FCA), issued a report Monday warning financial companies that it would be looking closely at so-called 'algo trading': "Algorithmic Trading Compliance in Wholesale Markets" (PDF).

Algo (or algorithmic) trading is the use of computer algorithms to buy or sell stock automatically, and at speed, when certain market conditions are met. The danger is that rapid trading by computers can change the market, causing more buying or selling before human traders can intervene and correct the situation. Such algo trading has been blamed as partly responsible for this month's Wall Street sell-off, which led to a 4% fall in the Standard & Poor's 500-stock index last Monday -- the worst decline since August 2011.

David Murray, Corvil's chief marketing and business development officer, explains the problem. "It takes a person 300-400 milliseconds (thousandths of a second) to blink, and computers can execute a trade in 30-40 microseconds (millionths of a second) -- so it is clear that the new reality of time in an algorithmic world mandates new oversight and controls."
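The gap Murray describes can be made concrete with simple arithmetic. The sketch below uses only the figures quoted above (a blink of roughly 300-400 milliseconds, a trade of roughly 30-40 microseconds) to show how many machine trades can complete inside a single human blink:

```python
# Illustrative arithmetic only, using the figures quoted in the article:
# a human blink takes ~300-400 ms; a trade can execute in ~30-40 us.
BLINK_SECONDS = 0.300    # lower bound of a blink (300 ms)
TRADE_SECONDS = 40e-6    # upper bound of a trade execution (40 us)

trades_per_blink = BLINK_SECONDS / TRADE_SECONDS
print(f"Trades completed within one blink: {trades_per_blink:,.0f}")  # 7,500
```

Even with the most conservative ends of both ranges, thousands of trades complete before a human can physically react -- which is the case for automated, machine-speed oversight.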

In its new report, compiled in the months preceding last week's Wall Street sell-off, the FCA warns, "In the absence of appropriate systems and controls, the increased speed and complexity of financial markets can turn otherwise manageable errors into extreme events with potentially widespread implications." Because of this, it adds, "We will continue to assess whether firms have taken sufficient steps to reduce risks arising from algorithmic trading."

Five key compliance areas are highlighted by the FCA: a full understanding and management of algorithms across the business; robust development and testing processes for algorithms; pre- and post-trade risk controls; an effective governance and oversight framework; and the ability to monitor for potential conduct issues and thereby reduce market abuse risks.

This isn't just about automated trading with the potential to wobble global financial markets -- it is also about localized and criminal abuse of algorithms. In November 2017, the FCA fined Paul Axel Walter -- subsequently known as the 'algo-baiter' -- £60,090 for market abuse via algorithms. Walter was a senior bond trader working at Bank of America Merrill Lynch (BAML). In 2014, he entered bids into the system that reflected the opposite of his intention. The algorithms reacted to those bids, allowing him to subsequently enter his true bids into a market that he had manipulated.

But the issues go beyond just financial trading. "Similar conditions exist not only across global financial markets," explains Murray. "There are similar risks for other algorithmic businesses and use of artificial intelligence."

With the digitization and computer-based automation of all industry, the problems currently highlighted in the financial sector will become an issue for businesses generally. Actions will be triggered, and acted upon, by unseen algorithms hidden within the system. It already happens within security products, where decisions can be made without anyone really understanding how or why they were reached. At the same time, outsiders will be able to manipulate the algorithms by feeding them false information, similar to Walter's manipulation of the trading algorithms.

The FCA's five principles for algo compliance are applicable far beyond just financial institutions. Compliance officers and security teams will need to understand their use of algorithms within machine learning and artificial intelligence systems to remain compliant and defeat both internal and external malicious actors. Key, perhaps, is the second principle: robust development and testing processes. This is particularly relevant where a business develops its own algorithms -- as is common in the financial industry -- rather than relying, blindly, on externally developed algorithms.

Algorithm development is subject to the same pressures as any other software development -- the need to get it complete and operational as quickly as possible. The FCA warns against development procedures that focus on operational effectiveness without considering other issues. An example outside of finance could be automated customer or user profiling without considering the impact of the General Data Protection Regulation (GDPR). Article 22 states, "The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her."
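One way to operationalize the Article 22 constraint is to build the check into the decision pipeline itself, so that any automated decision with a significant effect on the data subject is routed to human review rather than acted on directly. The sketch below is a hypothetical illustration; the class, field names, and routing logic are assumptions, not a compliance implementation:

```python
# Hypothetical sketch: gating automated decisions under GDPR Article 22.
# A decision that "produces legal effects ... or similarly significantly
# affects" the data subject must not rest solely on automated processing,
# so it is routed to human review instead of being applied automatically.
from dataclasses import dataclass

@dataclass
class Decision:
    subject_id: str
    outcome: str
    significant_effect: bool  # e.g. credit refusal, contract termination

def route(decision: Decision) -> str:
    """Return the processing path for an automated decision."""
    if decision.significant_effect:
        return "human_review"  # Article 22: no solely automated decision
    return "automated"

print(route(Decision("u1", "deny_credit", True)))   # human_review
print(route(Decision("u2", "show_banner", False)))  # automated
```

The point of the sketch is that the compliance question -- does this decision significantly affect the person? -- has to be answered per decision type at design time, not bolted on after the profiling system ships.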

The FCA's advice is good for all software development: "a culture of open communication between different business units, while having a clear separation of roles and independent reviews... by having a separate team that verifies and checks the output and quality of code."

As the algorithms get more complex, they get more difficult to control. "There's often a tradeoff between model or algorithm performance and complexity," explains Endgame's technical director of data science, Hyrum Anderson, "with higher performing models often requiring more model mass. Examples include: more trees in random forest or gradient boosting models, more layers in convolutional neural networks, etc. As a design principle, experienced machine learning researchers try to utilize the principle of Occam's razor -- when many models have similar performance, choose the simpler one."

But he also warns that while simplicity aids in human understanding and verification, and prevents models from making extreme predictions, it also potentially creates the best conditions for adversaries to fool them. While DevOps may be good for software development, DevSecOps would be better for algorithm development to ensure the most secure and reliable outcome.

Another of the FCA's five principles is also relevant to compliance and security teams beyond just the financial industry: the ability to monitor for potential conduct issues. Two aspects of this requirement are particularly relevant: network monitoring for signs of abuse or misuse; and algorithm testing standards and procedures.

The first will become increasingly challenging. Security teams already monitor their networks for anomalous events; but they use algorithms to do so. As algorithmic automation increases throughout industry, security teams will need methods to monitor even the algorithms they themselves use to monitor other aspects of the business. They will need to be able to detect malicious external actors attempting to subvert the algorithms, and insiders attempting to manipulate them. This is of course particularly concerning in the financial sector, where entire markets, and potentially national economies, could be manipulated for criminal gain -- or individual company share prices manipulated in sophisticated versions of pump-and-dump schemes.

Corvil's Murray summarizes the problem. "To operate in today's machine time environments and enable rapid, secure, compliant time to market, businesses require process controls as well as layered technology oversight to assure precision and accuracy of time stamping to establish sequencing, continuous capture of all electronic business activity, real-time analysis of transactions, and anomaly detection for cyber and abuse surveillance."

Testing the validity of algorithms will also be a problem. The third-party anti-malware testing industry is already struggling to find methods of adequately and objectively testing algo-based endpoint protection systems. As companies begin to develop their own algorithms for their own automation purposes, testing will likely fall on the very people who developed the algorithms. Objectivity may be impossible, and testing may not be effective.

The FCA's algorithmic trading compliance report should be a clarion call for all businesses. The new and emerging world of artificial intelligence -- that is, algorithms -- promises huge benefits for industry in increased speeds and lower costs; just as it does in the financial markets. But whether industry generally has fully examined the security and compliance issues that algorithms bring with them is a separate but urgent question. Algorithmic Trading Compliance in Wholesale Markets is a good starting point.

Related: The Role of Artificial Intelligence in Cyber Security 

Related: Privacy Fears Over Artificial Intelligence as Crimestopper 

Related: Using Machine Learning for Red Team Vs Blue Team Wargames 

Related: Corvil Launches Automated Security Tool for Financial Exchanges 

Kevin Townsend is a Senior Contributor at SecurityWeek. He has been writing about high tech issues since before the birth of Microsoft. For the last 15 years he has specialized in information security; and has had many thousands of articles published in dozens of different magazines – from The Times and the Financial Times to current and long-gone computer magazines.