Past efforts to regulate the U.S. stock markets have brought mixed success and more than a few unintended consequences. With momentum building after the 2008 crisis to impose further regulation on financial markets, Torben G. Andersen, a professor of finance at the Kellogg School, thinks now is the time to push for a better understanding of the markets as they exist today. “If this ongoing regulation is to have bite and be sensible and be structured the right way, we do need to understand what is right and wrong in the current way things are organized,” says Andersen.
Once scholars have a better idea of how dark pools operate, for instance, or how high-frequency traders behave in the current market landscape, they could work with regulators to determine the impact of different kinds of regulations before they are put in place.
“It’s very dangerous to throw regulation out there, because you don’t want to kill trading in the U.S.,” says Andersen. This is why experimentation will be so important, he continues: “We take a subset of the markets, a subset of the stocks, make them subject to some regulation, follow it for 3 or 6 months, see what happens, and then decide whether you want to do this for all the stocks and all the exchanges.”
But before this kind of experimentation can happen, researchers will need access to better datasets—and the right tools to deal with this data.
A History of Stock Market Regulation
The U.S. stock markets used to be dominated by just a few exchanges, each of which had a near monopoly on certain stocks. The markets relied heavily on human interaction to operate. But as technology for trading securities began to improve around the world, the U.S. was slow to change.
Nervous that electronic exchanges elsewhere would overtake the U.S. financial markets, the SEC in 2007 implemented regulation designed to propel domestic markets into the digital age. “They wanted to make the system better,” says Andersen, “more competition and more access.”
This regulation, known as Reg NMS, spelled out a number of ground rules for both exchanges and brokers. Some of these rules have worked largely as expected. For instance, regulations on tick size—the minimum increment in which a price can move up or down—have successfully prevented brokers from jumping ahead of other traders by fragmenting prices further and further, and in the process making price comparisons nearly impossible.
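The tick-size rule lends itself to a simple illustration. The minimal sketch below checks whether a quote lands on the minimum increment and snaps an off-grid price back onto the grid; the $0.01 tick and the function names are illustrative assumptions, not drawn from any exchange's actual rulebook.

```python
from decimal import Decimal

# Illustrative minimum tick of one cent; Reg NMS bars quoting most
# stocks in finer increments (the "sub-penny" rule).
MIN_TICK = Decimal("0.01")

def is_valid_quote(price: str, tick: Decimal = MIN_TICK) -> bool:
    """Return True if the quoted price lands exactly on the tick grid."""
    return Decimal(price) % tick == 0

def snap_to_tick(price: str, tick: Decimal = MIN_TICK) -> Decimal:
    """Round an off-grid price down to the nearest permitted tick."""
    p = Decimal(price)
    return p - (p % tick)
```

Under this rule, a quote of $10.0099 is rejected (or snapped to $10.00) rather than allowed to jump the queue ahead of a $10.00 bid by a hundredth of a cent.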
But other rules, says Andersen, have placed unintended burdens on brokers and market infrastructure: “You’re forcing an interconnectedness that doesn’t necessarily have to be there.”
More Transparency, More Complexity
The most substantial regulations prodded the exchanges to publicly post the best price at which a particular stock could be bought or sold and forced brokers to check all of these prices to get the best deal for their customers. The goal was to increase transparency and access to the markets, spurring competition.
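The best-price obligation amounts to computing a national best bid and offer (NBBO) across every venue. A minimal sketch, with hypothetical venue names and quote values, assuming each venue reports one best bid and one best ask:

```python
def nbbo(quotes):
    """Return ((venue, best_bid), (venue, best_ask)): the highest bid
    and the lowest ask posted anywhere across the venues."""
    bid_venue, (best_bid, _) = max(quotes.items(), key=lambda kv: kv[1][0])
    ask_venue, (_, best_ask) = min(quotes.items(), key=lambda kv: kv[1][1])
    return (bid_venue, best_bid), (ask_venue, best_ask)

# Illustrative quotes only: venue -> (best bid, best ask) on that venue.
quotes = {
    "NYSE":   (100.01, 100.05),
    "NASDAQ": (100.02, 100.06),
    "BATS":   (100.00, 100.04),
}
```

A broker with a market buy order here would have to route it to the venue showing the lowest ask, no matter how small or new that venue is, which is exactly the interconnectedness Andersen describes.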
But as straightforward as these rules sound, they actually made the markets considerably more complex. “Unforeseen consequences have made the system potentially a little unstable,” explains Andersen. “[Regulation] should induce competition, and it certainly has—but it almost made it too easy to get on the map as an exchange.”
Because brokers are now compelled to seek out the best prices for their customers no matter where those prices are posted, even relatively new exchanges can generate a lot of order traffic, especially if they use financial incentives to lure brokers to post orders there instead of elsewhere. “By offering rebates and opening up an electronic platform,” says Andersen, “suddenly you can go from being nothing to being a fairly big exchange.”
And the boost in the number of exchanges, in turn, has increased the burden on brokers charged with finding the best prices for their customers. “We now have thirteen exchanges. Not two or three or four as we used to have,” Andersen explains.
The increase in order traffic necessitated by the regulations also strains the markets’ electronic infrastructure. “You can see how it gets complicated,” says Andersen. “If NASDAQ has a technological problem, and everybody wants to go and check their quotes and they can’t, the market will start grinding to a bit of a halt.”
Proliferation of Dark Pools
While some trading platforms embraced transparency, becoming proper exchanges, others essentially went into hiding. In dark pools, orders are placed not publicly but internally, meaning they are seen only by the institution running the pool. This institution then matches buyers of a given stock with sellers.
Dark pools offer some benefits to customers. “If I want to sell IBM and you want to buy IBM and we have the same broker, we could actually just trade on the midpoint” in a dark pool run by a bank, Andersen explains. “I don’t pay any commission and you don’t pay any commission—they can just clear it internally at minimal cost. Could save us both money.”
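The midpoint trade Andersen describes is easy to sketch. Assuming hypothetical order quantities and a prevailing public bid and ask, an operator's internal cross might look like the following; the quantities, prices, and function name are all illustrative.

```python
def midpoint_cross(buy_qty, sell_qty, public_bid, public_ask):
    """Match an internal buyer against an internal seller at the midpoint
    of the public bid/ask spread; the smaller side caps the fill."""
    price = (public_bid + public_ask) / 2
    filled = min(buy_qty, sell_qty)
    return filled, price

# A buyer wanting 500 shares meets a seller offering 300 against a
# public market of 100.00 bid / 100.04 ask: 300 shares cross at the
# midpoint, near 100.02, with neither side paying the spread.
filled, price = midpoint_cross(500, 300, 100.00, 100.04)
```

Both sides do better than the public quotes: the buyer pays less than the 100.04 ask, and the seller receives more than the 100.00 bid.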
But dark pools offer the biggest benefits to mutual funds and other traders who wish to place very large orders without alerting others to their presence. To get the best prices on the public exchanges, these traders would have to split their orders into much smaller pieces and shop them around all over. But because they are forced to head to so many exchanges, it is possible for high-frequency traders to pick up on what they are trying to do. This may lead to “front-running,” where a high-frequency trader’s quicker trading technology allows him to “see” other traders’ current and likely future orders before they arrive at the intended destination. The high-frequency trader can then exploit his knowledge of impending order imbalances, making profitable trades and driving up execution costs for fund managers.
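The order-splitting these traders are forced into can be sketched as slicing one large parent order into randomized child orders, so that no single venue ever sees the full size. The slice bounds and seeding below are purely illustrative assumptions:

```python
import random

def slice_order(total_shares, max_child, seed=0):
    """Split a parent order into randomized child orders of at most
    max_child shares each (max_child assumed to be at least 100)."""
    rng = random.Random(seed)  # seeded for reproducibility in this sketch
    children, remaining = [], total_shares
    while remaining > 0:
        qty = min(remaining, rng.randint(100, max_child))
        children.append(qty)
        remaining -= qty
    return children
```

Randomizing the child sizes makes the parent order harder to reconstruct, but as the article notes, routing the slices across many venues still leaves a footprint that fast traders can detect.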
Dark pools appear to offer welcome cover. “If you go into a dark pool, those purchases are not immediately revealed,” says Andersen. “Over the course of a day you can do these purchases and only at the end of the day will it be revealed to the rest of the world that at these and these and these prices, these shares were traded.”
Dark Pool Dilemma
But the secrecy has its downsides. Namely, very little is understood about how the pools operate.
“The issue is that there’s no direct regulation as to how you actually got matched up in the dark pool,” says Andersen. This makes it nearly impossible to determine just how a trade was made—and thus whether a customer was given the best possible deal. Perhaps a broker is drawn to the pool not by prices, for instance, but by expected kickbacks. “It may also be they own the dark pool, so they can get some of the fees associated with trading there,” says Andersen. “It gets complicated. How do you truly enforce the NMS?”
It is also difficult to determine the extent to which the dark pools are filled with high-frequency traders. “The people running the dark pools have a huge interest in getting the trades done internally,” says Andersen. In order to do so, they need to attract liquidity. Do they invite high-speed traders into the pools to bring that liquidity—and if so, are these traders given information that other market participants are not? For example, traders could use knowledge about the ratio of institutional orders relative to retail orders in the pool to determine the likelihood that their trading partner is well informed about the underlying security values.
“It’s not regulated,” says Andersen.
Currently, competition from other pools and exchanges is all that exists to keep the dark pools honest. “They have an interest in being transparent to their customers to reassure them that things are functioning fairly,” says Andersen. And, of course, results matter too. “If you keep, for a month or two months, getting what seems to be a poor execution out of these dark pools, you can just say, I don’t want to send it to dark pools. I will go to another broker who promises not to send it there.”
Still, just how well dark pools serve customers remains unclear—even on aggregate. According to Andersen, the results of past academic studies are “all over the place” in terms of whether dark pools provide better or worse prices than the public exchanges.
Resistance to Regulation
A number of regulations are currently on the table to improve trading and further protect customers. These include rules about the types of rebates available to brokers, the trading documentation brokers are required to show their customers, how high-frequency traders are allowed to operate, and how dark pools can be run. Regulators might also consider slowing down trading from once a millisecond or microsecond to once a second in an effort to thwart the technological arms races that may offer very little social value in return.
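The "once a second" idea corresponds to what economists call a frequent batch auction: orders accumulate over a discrete interval and then clear all at once at a single price, blunting the advantage of microsecond speed. A toy sketch of one auction round, assuming the clearing price is taken as the midpoint of the last crossed bid/ask pair (one of several possible rules):

```python
def batch_auction(bids, asks):
    """Clear one discrete-time auction: sort bids descending and asks
    ascending, match while bids still cross asks, and print all trades
    at one clearing price."""
    bids, asks = sorted(bids, reverse=True), sorted(asks)
    matched = 0
    last_bid = last_ask = None
    while matched < min(len(bids), len(asks)) and bids[matched] >= asks[matched]:
        last_bid, last_ask = bids[matched], asks[matched]
        matched += 1
    price = (last_bid + last_ask) / 2 if matched else None
    return matched, price
```

Because every order arriving within the interval competes in the same auction, shaving microseconds off arrival time no longer buys priority, which is the point of the proposal.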
But, Andersen cautions, before any new regulations are put in place, researchers and regulators need to have a better understanding of how the markets currently operate.
Coming to this understanding will not be easy. Many traders and trading platforms are reluctant to give scholars access to data from trading accounts. Mutual fund managers do not want others to know how they split their orders. High-frequency traders do not want other high-speed traders to know how they operate. “They ping, they jump in front of each other and place and cancel orders at lightning speed to explore the state of the market,” says Andersen. If competitors learn your strategies, “that can remove the profitability very quickly because they will do exactly the same thing, or even front-run you!”
Even the public exchanges are resistant to change. “The exchanges benefit from a lot of order traffic,” says Andersen. “So anything that hurts the high-frequency trading firms might hurt the exchanges,” he says. “They have a little bit of an alliance.”
A Big Data Solution?
Of course, regulators could nonetheless insist on access to real-time data from trading accounts. But this would still leave another large problem, according to Andersen. Namely, in order to be of any use to researchers and regulators, the data would need to be accessible in a manageable way.
“The amount of message traffic in the system, because of the way it’s structured, is absolutely enormous,” explains Andersen. “It’s almost infinitely complicated to figure out … what are the optimal strategies, what are people doing, where are the vulnerabilities in the system if something breaks down?”
Andersen recommends that a single entity—be it the government, a regulatory agency, or another nonprofit organization—establish and manage a central database, which would then be made accessible to scholars for a modest fee. This would free individual institutions from the burden of handling such large amounts of data, instead spreading the labor and cost across as many research organizations as possible.
Evidence of change is already in the air. The independent, nonprofit organization FINRA recently announced that it is making some anonymized, fixed-income market data sets available to researchers with approved projects—a good first step.