Reg AT → Reg NB (Not Bad): Determining the Shape of Reg AT

Ari Pine
Published in Digital Gamma Blog
Mar 14, 2019
Proper testing can often avoid this trading moment. Photo by Sebastian Herrmann on Unsplash

By Colleen Graham and Ari Pine

It has been about three years since the CFTC put the algorithmic trading community into a furor over a Reg AT proposal: it would have required that all automated trading systems (ATS) file their source code with the CFTC. Now-Chairman Giancarlo put that proposal to rest while leaving room for a new framework.

In our view, a control framework is necessary although regulators and the regulated need to be careful about what they wish for. The goal of Reg AT is to make the markets safe from “flash crashes”, electronic fat-fingers, and catastrophic failures, of which KCG’s implosion has been the most visible. Those are certainly very reasonable goals for the regulator in charge of the futures markets. It just needs to be practical.

The industry will, or at least should, welcome a framework that is largely principles-based and does not reinvent the wheel: one that may require testing and/or certification without the unintended consequence of exposing highly sensitive proprietary code.

CFTC Commissioner Rostin Behnam, in fact, has the right lens on the issue:

Solving the problem is a matter of capturing as much of the unpredictable cases and errors as possible, and then figuring out how to train systems to deal with them. As one MIT engineer noted, it’s also a matter of accepting that humans are generally terrible overseers of highly automated systems.

Is the Industry Really Resisting Good Regulation?

Bad regulation is worse than no regulation at all. Yet there is considerable and increasing pressure on traders and developers to move ideas into production while the regulators figure this out. With a standardized framework, the CFTC would level the playing field so that there would be no penalty (delayed implementation) for following best practice, including rigorous testing. That should be the goal of AT regulation.

This would not be the first time that regulators have stepped in to overcome competitive forces or peer pressure. Consider the NHL’s requirement to wear helmets (to say nothing of goalie face masks). Although some players wore helmets, there were many, possibly even a majority, who would have worn them but for the pressure of hockey culture. Once the NHL adopted the rule, players were able to wear helmets “under the cover” of enforced requirements.

In the case of hockey helmets, much of the cost of safety failures falls upon the individual player. However, in capital markets, much of the cost of failure is distributed among other participants, sometimes widely. This is a tragedy of the commons and is exactly why regulators ought to step in.

So Why Did the Original Reg AT Proposal Fail?

The objective of the original proposed Reg AT was not flawed, but the method originally proposed certainly was.

The most onerous requirement of the proposed regulation was that firms file their source code with the CFTC. The industry did not want its proprietary technology in the hands of regulators, fearing leakage of its trade secrets. That is certainly a reasonable concern, but the entire proposition was problematic. For starters, what counts as the source code of an automated trading system (hat tip Mike Ryan of TT)? Does it include the matching engine at the exchange? Does it include the code from an independent software vendor (ISV) that deciphers exchange messages (feed handlers)? If I’ve used an algo platform like TT’s ADL or Deltix, does it include their code, too?

On top of all that, what would the CFTC do with the code? Even if everything were secure and everyone acted perfectly nobly, how could they even interpret the intention of the code? If you have written code, you know how difficult a task this is even when the coder is cooperative, or when the coder is you a few weeks prior. If Reg AT had been implemented, it is certain that code would have been deliberately obfuscated to keep away reverse engineers. Firms might simply have handed over machine code rather than, say, C++, Java, or Python, to make things even more difficult. The end result seems overwhelmingly likely to have been code that was indecipherable at the CFTC yet still carried a risk of being reverse engineered.

“Watch what they do, not what they say”

It is important to understand that AT designers, like everyone else, are under pressure to produce results. That means everything is at all times being weighed against constraints, i.e., money and time. In practice, that means exhaustive testing is not always done. This is exactly the point at which regulators can be effective. ATS are complicated systems. Development nearly always requires the work of multiple engineers and often multiple companies. The opportunity for insufficient communication or miscommunication is overwhelming. Each participant may have their own set of assumptions about how a component ought to behave or what a variable means. It would be surprising if everything worked together in exactly the manner that all developers believed it would.

Whether you are a programmer or a user, we all know that every piece of software that we use has bugs: we just may not yet know what they are. The question is not if they exist but how to minimize their negative impact.

Based on our research, the previously proposed Reg AT neither addressed determining a priori whether code would work nor attempted to short-circuit the behavioral biases of trading firms. This is precisely what needs to change.

Don’t reinvent the wheel

It seems to us that reasonable best practices for design and testing are known and that the “MIT engineer” is correct about poor human oversight of automated processes. The Futures Industry Association (FIA) long ago published guidelines for the design of ATS. The shortcomings are known and good practices are known, but the community is coming up short in implementation, specifically at critical but infrequent moments. Those are exactly the cases that are least understood and most likely to be the least tested. Given the already published FIA guidelines, what appears to be necessary is behavior modification rather than new information. The CFTC, in other words, needs to mandate the already known best practices.

The industry has coalesced around FIX as a messaging standard. A new standard for ATS design, built on best practices, can be set on top of it. It should not touch the purpose or content of the algorithm itself, but rather standardize a minimum requirement of inputs, outputs, and perhaps components. For instance, it could require that each ATS have an external automated risk management module that acts as a governor of the position size or frequency of trading for the ATS, as sketched below.
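To make that concrete, here is a minimal sketch of what such an external governor might look like. It is written in Python for readability; the class name, limits, and methods are hypothetical illustrations rather than a proposed standard.

```python
# Minimal sketch of a standalone risk "governor" sitting between an ATS and the
# exchange gateway. Names and limits are hypothetical and purely illustrative.
import time

class RiskGovernor:
    def __init__(self, max_position, max_orders_per_sec):
        self.max_position = max_position              # absolute net position cap
        self.max_orders_per_sec = max_orders_per_sec  # message-rate cap
        self.position = 0
        self._order_times = []                        # timestamps of recent orders

    def approve(self, order_qty):
        """Return True only if the order respects both position and rate limits."""
        now = time.time()
        # keep only orders from the last second for the frequency check
        self._order_times = [t for t in self._order_times if now - t < 1.0]
        if len(self._order_times) >= self.max_orders_per_sec:
            return False                              # trading too frequently
        if abs(self.position + order_qty) > self.max_position:
            return False                              # would breach the position cap
        self._order_times.append(now)
        return True

    def on_fill(self, filled_qty):
        """Update the net position when the exchange reports a fill."""
        self.position += filled_qty


# Usage: every outbound order passes through the governor before the gateway.
governor = RiskGovernor(max_position=100, max_orders_per_sec=50)
if governor.approve(order_qty=10):
    pass  # hand the order to the FIX session / exchange gateway
```

The key design point is that the governor sits outside the strategy: it does not need to know why the ATS wants to trade, only whether the resulting position and message rate stay within limits.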

Let’s Do It On Principle

Common terminology in the regulatory space distinguishes between prescriptive and principles-based regulation. A principles-based approach asks the entity, industry, or individual subject to the regulation to determine the details that ensure compliance with a thematic, non-detailed rule. Prescriptive regulation may require, for example, specific testing, monitoring, or training designed by the regulator. This is not an easy line to draw: a principles-based regulation may not carry enough force to ensure proper testing, while a prescriptive approach written with the best intentions may become overly complex, fall behind new technologies or practices, or simply require a whole lot of things and miss the point. For instance, a power market where credit lines are necessary to participate may be very different to test than a short-term interest rate market.

In our opinion, if rules and regulations for ATS make sense at a federal level, they must be principles-based to allow for compliance that considers the market and industry practices to date. The goal is not, and realistically cannot be, to avoid any possible glitch in the markets going forward. Instead, the goal ought to be the appropriate level of regulation that ensures proper process is in place. The FIA already publishes a Guide to the Development and Operation of Automated Trading Systems. Exchanges also provide certification environments, and the CME, at least, has a qualification process for any firm whose software will be the primary communication medium (feed handler). Even simply combining these three things, along with some defined scenarios, would be a significant step forward with what appears to be minimal cost and effort for the industry.

This seems to fit Commissioner Behnam’s sensibility:

“It is time to work collaboratively with the industry and our fellow regulators on new regulatory initiatives aimed at establishing appropriate principles and structures in furtherance of well-reasoned, targeted and timely oversight — which may or may not take the form of regulation.”

Implement the Helmet Rule

Regulation can maximize its value by creating a mechanism that encourages or requires a comprehensive testing or qualification regimen.

Giancarlo supported the possibility of utilizing simulation and stress testing in closed environments. This is something that we (the authors) welcome and hope comes to pass. After all, it is far more important to get a measure of what is actually occurring than to try to divine potential actions from reading code. The actions of traders, whether automated or discretionary, are what will ultimately impact the operation of markets.

We further propose that the CFTC mandate a standardized suite of tests covering fundamental operation, error handling, slow market conditions, and fast market conditions. For example, an energy trading ATS ought not to melt down during an EIA number.
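As a rough illustration of what such a mandated suite could look like, the sketch below assumes a hypothetical run_scenario() harness that replays canned market data against an ATS and returns summary statistics. The scenario names, fields, and thresholds are invented for illustration only.

```python
# Hypothetical standardized scenario suite; run_scenario(), the statistics it
# returns, and the ATS limit attributes are all assumptions for illustration.
SCENARIOS = {
    "fundamental_operation": "orderly two-sided market, normal volumes",
    "error_operation": "exchange rejects, dropped acks, out-of-sequence fills",
    "slow_market": "wide spreads, sparse updates, thin order book",
    "fast_market": "burst of activity around a scheduled number (e.g., EIA)",
}

def run_suite(ats, run_scenario):
    """Run every mandated scenario and report a pass/fail flag per scenario."""
    results = {}
    for name in SCENARIOS:
        stats = run_scenario(ats, name)                   # replay the canned data
        results[name] = (
            stats["max_position"] <= ats.position_limit   # stayed within risk limits
            and stats["peak_order_rate"] <= ats.rate_limit
            and not stats["crashed"]                      # no unhandled failures
        )
    return results
```

The point is not the particular checks but that everyone would run the same scenarios, so passing them becomes a baseline rather than a competitive sacrifice.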

But Wait, There’s More

A thorough but efficient certification process may come to be seen as an asset if done at the clearing-firm level. No customer of an FCM wants algos gone wild taking down the house. A firm with robust certifications ought to be seen as a safer place from which to trade. In fact, it is quite possible and desirable that certification service providers will emerge. A further benefit for the market is that a qualification process could also be a mechanism that vendors use to showcase product performance.

Depending on the construction, there could also be some significant benefits to the regulators. Without needing to see any code, the regulators could assemble the ATS in the “test-o-sphere” and observe. It would become evident how the entire ecosystem behaved. Regulators could begin to understand more clearly how displayed orders change in reaction to changing volatility, or how market behavior changes under different incentive (pricing) mechanisms. It seems reasonable for automated market participants to back off on their trading, in part or in whole, upon unexpected volatility or unrecognized behavior. Sure, it seems bad that markets may get thinner or wider, but is the better alternative to have an ATS continue to trade when “confused”? Maybe it is better to ensure a diverse ecosystem of participants, including ATS alongside discretionary traders. A simple sketch of that “back off when confused” behavior follows.
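The sketch below shows one way a hypothetical ATS might implement that back-off: it compares short-horizon realized volatility against the level it was calibrated for and widens or pulls its quotes accordingly. The thresholds and function names are assumptions, not a recommendation.

```python
# Illustrative "back off when confused" logic; thresholds are hypothetical.
import statistics

def decide_quoting_mode(recent_returns, calibrated_vol,
                        widen_factor=2.0, pull_factor=4.0):
    """Return 'normal', 'widen', or 'pull' based on short-horizon realized volatility."""
    realized_vol = statistics.pstdev(recent_returns)
    if realized_vol > pull_factor * calibrated_vol:
        return "pull"    # stop quoting until conditions are recognized again
    if realized_vol > widen_factor * calibrated_vol:
        return "widen"   # quote wider and smaller rather than trade "confused"
    return "normal"
```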

Participants would be concerned that, under a sufficient amount of observation, their strategies could be discerned. However, the danger would be no greater than what firms already deal with today. Exchanges are able to monitor all orders and fills from a given identifier; at the CME, for instance, all activity can be traced to a given iLink session. The exchange already uses this data to police poor behavior, such as sending too many messages, and to analyze whether there was manipulative activity (see our prior piece on spoofing!).

Finally

Traders follow the incentive structure embedded in the marketplace. Currently that incentive structure downplays the importance of allocating resources to testing, but it is within the grasp of the CFTC to take advantage of known best practices to create a set of principles-based standards and certification processes. The nudge from the regulator ought to make the highly automated futures market significantly safer without burdening the industry with overwhelming regulation, while keeping everyone’s intellectual property safe.
