Oct. 14: Updates with comments from Nasdaq.
It’s already Thursday, and the markets haven’t broken yet this week. No trading halts, no software glitches, not one rogue computer algorithm gone haywire. Not even an allegation of speed traders getting a jump on market-moving data. Hurrah! Of course, there’s always Friday.
This is all to say that the markets haven’t exactly been working smoothly recently. So Nasdaq (NDAQ) picked a good week to announce that it’s building the high-frequency-trading equivalent of a test track, where firms can test their trading algorithms before letting them loose in the real markets, where real money is at stake. The simulator will come packed with historical trading data that the algorithms can churn through to mimic actual market conditions. Curious how much money a particular trading strategy would’ve made (or lost) during the last debt-ceiling crisis? Now traders will be able to find out instead of guessing.
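Replaying historical data through a strategy to see what it would have earned is known as backtesting. A minimal sketch of the idea in Python, using a simple moving-average crossover strategy and made-up prices (all names and data here are hypothetical illustrations, not Nasdaq’s actual platform):

```python
# Toy backtest: replay "historical" prices through a trading strategy
# and tally the profit or loss it would have produced.

def backtest(prices, short_window=2, long_window=4):
    """Run a moving-average crossover strategy over a price series
    and return the final profit/loss per share traded."""
    cash, position = 0.0, 0  # position: 1 = long one share, 0 = flat
    for i in range(long_window, len(prices)):
        short_avg = sum(prices[i - short_window:i]) / short_window
        long_avg = sum(prices[i - long_window:i]) / long_window
        price = prices[i]
        if short_avg > long_avg and position == 0:
            position, cash = 1, cash - price   # buy one share
        elif short_avg < long_avg and position == 1:
            position, cash = 0, cash + price   # sell it back
    if position == 1:                          # liquidate at the end
        cash += prices[-1]
    return cash

# Hypothetical price history, e.g. one reading per minute.
history = [100, 101, 99, 98, 97, 99, 102, 104, 103, 101, 98, 97]
print(backtest(history))  # -6.0: this strategy would have lost money
```

A real simulator would also have to model order books, fees, and the market’s reaction to the strategy’s own orders, which is a large part of why building one in-house costs so much.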
Nasdaq is expected to unveil the simulator in January and charge trading firms a couple hundred dollars per simulation. That’s a whole lot cheaper than the millions it would likely cost a firm to build an equivalent testing environment for itself. Which is why so few firms have one.
The simulator will live in Nasdaq’s massive data center in Carteret, N.J., and is based on a similar testing platform developed in 2009 by the New Jersey-based high-frequency-trading firm Tradeworx. This is the same firm that for $2.5 million built the U.S. Securities and Exchange Commission’s high-tech surveillance system known as Midas (Market Information Data Analytics System), which helps regulators keep track of what speed traders are up to.
Why the lead time? Nasdaq would like nothing more than to show regulators how serious it is about tightening up risk management and increasing testing standards, especially amid so many recent screw-ups, including Nasdaq’s own Aug. 22 trading halt. “Policymakers want pieces of trading software to be more robustly tested before they’re unleashed in the market,” says Manoj Narang, Tradeworx’s chief executive officer.
Narang says he first reached out to Nasdaq and a few other exchanges about using Tradeworx’s simulator as the backbone for something that could be made available to the rest of the trading industry. “We initially tapped this simulation software because buy-side and sell-side firms want cost-efficient algo tests not only to protect their business but for the protection of the interwoven financial services ecosystem,” says Eric Noll, Nasdaq’s executive vice president of transaction services. Nasdaq is also responding to recent proposals from the SEC to mandate stricter testing systems. Speed traders are dealing with lower profits and need to lower their risk. “Trading is all about risk vs. reward,” Narang says. “And as the rewards have diminished, traders realized that risks need to be brought down as well.”
If firms choose to use it, the simulator will be most useful for the Chicago-based speed-trading firms, which Narang says tend to trade first and evaluate later. It’ll also give large buy-side firms like mutual funds the chance to see just how outgunned they truly are by speed-trading market-maker competitors. Narang says buy-side firms will be able to measure how much extra money they might’ve been able to make—if only they’d been a few milliseconds faster.
Mostly, though, the simulator is a chance for the high-frequency-trading community to prove it’s growing up and capable of policing itself, rather than needing regulators to do so. “In general, what you’re seeing is the industry starting to mature and take a leadership role in defining its own standards,” says Ben Van Vliet, a finance professor at the Illinois Institute of Technology who has spent the last two years pushing the trading industry to adopt a set of standards to help prevent blowups.
What’s not clear, however, is whether Nasdaq’s new test track would have prevented what is so far the biggest high-frequency screw-up of all: Knight Capital’s (KCG) $440 million debacle that was triggered when its algorithms went on a crazy buying spree and destroyed half the firm’s total value in August 2012. “No amount of testing can ever reduce the probability of failure to zero,” says Van Vliet.