The events surrounding Knight Capital’s recent $440m loss, coming so soon after the 2010 Flash Crash and the BATS and Facebook IPO disasters, have re-opened the debate on whether the ever-increasing automation of the financial industry is desirable, or even safe.
This is because, in truth, all of these scenarios could have been much worse (for the financial industry as a whole, rather than just the companies involved). Imagine if the market hadn’t rebounded in 2010, or if the ‘glitch’ suffered by Knight had instead struck a systemically important institution that required a taxpayer bailout.
Everyone seems to be in agreement that it’s a scary situation where a technological glitch or a small mistake in coding can lead to a major market maker like Knight being almost ruined in the space of 45 minutes. What is less clear is what the industry can do to stop events repeating themselves.
SEC accelerating rules
Chairman of the Securities and Exchange Commission (SEC), Mary Schapiro, has responded by asking the SEC staff “to accelerate ongoing efforts to propose a rule to require exchanges and other market centres to have specific programs in place to ensure the capacity and integrity of their systems”.
The SEC has also announced plans to host a technology round table next month to explore different ways to promote stability in markets that are reliant on highly automated systems.
While it is good to see the SEC taking a thoughtful approach to this issue, it is unlikely to do much to restore investors’ confidence in the short term.
Perhaps at this round table the SEC will consider making Automation Review Policies (ARPs) mandatory, which the US Government Accountability Office recommended that the Commission should do in 2004 and Schapiro argued in favour of in March 2011.
ARPs would require firms to have appropriate technology in place and to assure its functionality via regular capacity planning and testing exercises and system vulnerability assessments. As part of the ARPs, the Commission would also expect firms to have an annual independent review and to be notified of any system outages or material system changes.
It is unclear why ARPs have not been adopted; perhaps they were considered an unnecessary and burdensome piece of red tape for firms. While recent events suggest that ARPs should be revisited, it is not clear that they would have prevented Knight’s software glitch.
The German approach
The German government has proposed new rules, drafted before Knight’s software glitch, that take another approach to solving this problem. The German Ministry of Finance has drafted rules that would require high-frequency trading (HFT) firms to apply for licensing as financial institutions and to submit their algorithmic trades and strategies to the regulator.
But Rick Lane, chief technology officer at Trading Technologies, argued that this is still not an adequate solution as it is unlikely that the German regulators will have the specialist expertise to effectively analyse all of the complex source code that they would be presented with.
However, he did concede that the unintended consequence of this regulation might make for a safer environment because the delay caused by regulators looking at code would give financial firms more time for testing their software.
Solving the wrong problem
According to Lane though, the regulators are trying to solve the wrong problem.
“I think that a lot of people are looking at this from the perspective of: how we can prevent this from ever happening? And we’re going to waste a lot of cycles and time trying to solve that problem because it’s unsolvable,” he said.
The idea that regulators, and much of the industry in general, are missing the real problem was supported by Rik Turner, a senior analyst of financial services technology at Ovum.
Turner claimed: “The question should not be whether trading has become too reliant on technology, but rather whether the companies that have deployed all this technology have installed sufficient monitoring and management systems for the event that the technology goes wrong, or human error sets off a process whereby the technology goes rampant.”
Good governance essential
He advocated that instead firms need to place greater emphasis on good governance. “Let’s not blame technology when it’s really human implementation of that technology and the way people manage it that’s gone wrong,” he said.
Turner stressed that there are multiple dimensions of risk, and what Knight had been unprepared for was its operational risk; as a result, the blame lies at the feet of managers who failed to put effective monitoring systems in place.
“I’d be surprised if a few high profile executives don’t have to fall on their swords as a result of this,” he predicted. But most firms believe that they have good governance and Turner admitted that there is little that can be done to address this beyond shareholders being very active in ensuring that they have vigilant people on their executive board.
A new focus for innovation
Lane, meanwhile, said that innovation is key to protecting the industry against future technological glitches. He argued that innovation in the financial markets currently centres too much on the speed of execution. Instead, he said, the focus should be on identifying and mitigating problems like the one Knight suffered.
“Where we really need to focus on innovating is in providing the tools for programmers to develop a safer environment that is capable of using higher levels of logic in trade related checks,” added Lane.
Trading Technologies claims to have the edge in this department with ADL (Algo Design Lab), a programming environment that does exactly this: analysing a firm’s end product in the context of trading and flagging any potentially unintended consequences.
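To make the idea of “trade-related checks” concrete, here is a minimal sketch in Python of a pre-trade risk layer that vets every order against quantity and notional limits before it reaches the market. The class name, limit values, and order fields are illustrative assumptions, not part of any vendor’s product or regulatory standard.

```python
from dataclasses import dataclass


@dataclass
class Order:
    symbol: str
    side: str       # "buy" or "sell"
    qty: int
    price: float


class PreTradeRiskChecker:
    """Rejects orders that breach simple, configurable limits.

    All limit values here are hypothetical examples; real firms
    would calibrate them per instrument, desk, and strategy.
    """

    def __init__(self, max_order_qty=10_000, max_notional=1_000_000.0):
        self.max_order_qty = max_order_qty
        self.max_notional = max_notional

    def check(self, order: Order):
        """Return (accepted, reason) for a single order."""
        if order.qty <= 0:
            return False, "quantity must be positive"
        if order.qty > self.max_order_qty:
            return False, f"qty {order.qty} exceeds limit {self.max_order_qty}"
        notional = order.qty * order.price
        if notional > self.max_notional:
            return False, f"notional {notional:.2f} exceeds limit {self.max_notional:.2f}"
        return True, "ok"
```

The point of such a layer is that it sits between the strategy code and the exchange gateway, so even a badly behaved algorithm cannot emit orders outside hard limits.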
Lane claimed that there is little that regulators can actually do to prevent software malfunctions doing damage, and that there must be a bottom-up cultural shift in attitude across the financial industry towards placing the emphasis on making systems “better” rather than “faster”.
Taking a tough stance
One thing that Lane does think that regulators could do to encourage this change is to take a tougher stance on firms that are found to not have adequate risk management and monitoring systems in place.
“I think that the penalties need to have teeth beyond just the money that you lose with an algo. If people know that they’ll be banned from trading or receive an additional fine above and beyond what they’ve lost then it’s a deterrent, just like any other punishment of crime,” he said.
Prevention is supposedly better than cure, but it may be that this is not possible in the case of software glitches.
The real failure at Knight Capital was not that an algo started making bad trades, but that it was allowed to do so for 45 minutes. Today’s automated trading environment significantly magnifies the impact of a software glitch, so firms need to ensure that they have appropriate monitoring systems and circuit breakers in place.
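The kind of circuit breaker this implies can be sketched simply: a kill switch that trips when order flow in a short sliding window exceeds a threshold, exactly the sort of runaway behaviour a 45-minute incident suggests was never caught. The thresholds and class design below are hypothetical, chosen only to illustrate the mechanism.

```python
import time
from collections import deque


class CircuitBreaker:
    """Trips (and stays tripped) when more than `max_orders` fire
    within `window_seconds`. Resetting is deliberately left to a
    human operator -- the sketch assumes automation should halt,
    not self-heal, once it misbehaves.
    """

    def __init__(self, max_orders=100, window_seconds=1.0):
        self.max_orders = max_orders
        self.window = window_seconds
        self.timestamps = deque()
        self.tripped = False

    def record_order(self, now=None):
        """Return True if trading may continue, False once tripped."""
        if self.tripped:
            return False
        now = time.monotonic() if now is None else now
        self.timestamps.append(now)
        # Discard events that have fallen outside the sliding window.
        while self.timestamps and now - self.timestamps[0] > self.window:
            self.timestamps.popleft()
        if len(self.timestamps) > self.max_orders:
            self.tripped = True  # halt all trading until manually reset
            return False
        return True
```

In practice a firm would trip on several signals at once (fill rates, position drift, realised loss), but the principle is the same: the breaker must act in seconds, not minutes.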
The regulators, for their part, need to show some leadership in determining what systems are necessary for firms to prevent aberrant trading and then mandate that they have them in place. Otherwise, it could be a lot worse next time.