High Frequency Trading: Wall Street’s Doomsday Machine?

Brendan McDermid / Reuters

Traders work at the Knight Capital kiosk on the floor of the New York Stock Exchange Aug. 3, 2012.

Another week, another Wall Street scandal, and another opportunity for pundits to bemoan the incompetence and venality of America’s financial professionals. Last Wednesday’s near collapse of Knight Capital Group – in which a bug in one of its high-frequency trading algorithms caused the firm to lose $440 million – has raised concerns about high-frequency trading and what the practice means for the safety and trustworthiness of our financial markets. But what is high-frequency trading, and is it really all that dangerous?

High-frequency trading is a catch-all term that describes the practice of firms using high-powered computers to execute trades at very fast speeds – sometimes thousands or millions of trades per second. These systems have developed over the past ten years, and began to really dominate Wall Street over the last five. For example, a high-frequency trader might try to take advantage of minuscule differences in prices between securities offered on different exchanges: ABC stock could be offered for one price in New York and for a slightly higher price in London. With a high-powered computer and an “algorithm,” a trader could buy the cheap stock and sell the expensive one almost simultaneously, making an almost risk-free profit for himself.
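The cross-exchange arbitrage described above can be sketched in a few lines of code. This is a toy illustration with made-up venue names and quote numbers, not a real trading system; actual HFT platforms work on live market-data feeds at microsecond timescales, with latency, fees, and execution risk eating into the apparent profit.

```python
# Toy sketch of cross-exchange arbitrage detection.
# A quote is (best_bid, best_ask) for the same stock on a given venue.

def find_arbitrage(quotes, min_profit=0.0):
    """Return (buy_venue, sell_venue, profit_per_share) if the best ask
    on one venue is below the best bid on another, else None.

    `quotes` maps a venue name to its (best_bid, best_ask) pair."""
    best = None
    for buy_venue, (_, ask) in quotes.items():
        for sell_venue, (bid, _) in quotes.items():
            if buy_venue == sell_venue:
                continue
            profit = bid - ask  # sell at the bid, buy at the ask
            if profit > min_profit and (best is None or profit > best[2]):
                best = (buy_venue, sell_venue, profit)
    return best

# Hypothetical quotes for "ABC": cheaper in New York, dearer in London.
quotes = {"NYSE": (100.00, 100.02), "LSE": (100.05, 100.07)}
print(find_arbitrage(quotes))  # buy on NYSE at 100.02, sell on LSE at 100.05
```

In practice the opportunity exists only for an instant, which is why speed, not the arithmetic, is the hard part of the business.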

At first blush, it appears high-frequency traders have done what market observers generally like, which is increase the amount of trading going on at any given time (what traders call volume) and the ease with which someone can buy or sell a given security (commonly known as liquidity). Basically, high-speed traders add to the overall action in a market, which, at least theoretically, is supposed to make markets more accurate and efficient.

But not everyone agrees that the thousands of extra trades per second that some computer algorithms are executing are actually all that good for the market. In a recent paper titled “The Dark Side of Trading,” Emory University accounting professor Ilia D. Dichev argues that while some high-frequency trading can be beneficial, the scope of high-frequency activity in the market today — which accounts for up to 70% of all trades by some estimates — has drowned out the input from more traditional investors who partake in good-old-fashioned “fundamental” analysis of companies, i.e. the analysis of financial statements and business plans.

When only a small minority of market participants are trading on actual information about the companies they are buying and selling, and the rest are trading on what computer programs think about those participants’ actions, the market can become unmoored from fundamentals, more volatile and less efficient.

Perhaps even more problematic, high-speed trading systems may also pose risks to the stability of the overall financial system. The Knight Capital incident last week showed how the speed and power of the computers can amplify a small glitch to disastrous effect. The details of what exactly went wrong with one of Knight Capital’s computer programs are as yet unknown, but whatever the problem was, it took only 30 minutes to cause hundreds of millions of dollars in losses and put the firm at risk of bankruptcy.

As you can imagine, this kind of stuff leads to systemic dangers. If a similar incident were to bring down a systemically important financial institution, it could theoretically put the entire economy at risk. The so-called “flash crash” of 2010, when the Dow Jones lost more than 1,000 points in a matter of minutes, was caused by glitches in high-frequency trading systems — and since that time issues with the Facebook and BATS IPOs have shown that these sorts of bugs are all too common on Wall Street. Dave Cliff, a computer science professor at the University of Bristol and one-time pioneer of high-frequency trading systems, is worried about just this sort of danger. In a December interview with HFT Review, he warned of the systemic issues that the increased computerization of our financial system poses:

“One of the things that we have focused on . . . for the last five years is the extent on which the global financial markets are now essentially a single, planetary-wide, ultra-large scale complex IT system . . . The 6th May Flash Crash was the first real sign that actually our concern was justified, that events could happen at an unprecedented scale, in terms of the magnitude of the drop and the speed at which it happened.”

While issues of financial market efficiency and fairness are important, the idea that one of these computer programs could bring down the entire financial system is much more frightening for the average citizen. The SEC has already begun looking into new rules to prevent these sorts of problems from spreading to the broader market. According to a report in the Wall Street Journal:

“The SEC expects to push forward with rules that will require exchanges and other market platforms to regularly test for software glitches and inform regulators of computer problems . . . At the same time, the agency also is weighing whether to require tighter rules for trading firms such as Knight, including whether a senior official at each firm should certify that the company’s technology systems are up to snuff.”

We are increasingly dependent on computers for all that we do, and the government won’t always be able to prevent their malfunctioning from causing serious problems. But the many glitches that have plagued financial markets in the past couple of years should serve as a sobering reminder that financial markets have evolved much more quickly in the past decade than regulators have.

As Scott Patterson, author of Dark Pools, a book about high-frequency trading, said to Yahoo Finance Monday, “We have seen a massive revolution in how exchanges work. It’s been put in place extremely fast . . . the problem is that the race for profits at the exchanges and at the high-frequency firms has outpaced their ability to manage risk.”

Comments

Six000MileYear

This article points the finger correctly at HFT as a bad thing, but does not complete the list of problems HFT creates.

1.) Algorithmic trading places orders to entice traders, or other HFT players, to adjust their orders. These false bids can shut a network down, preventing real people from placing their orders.

2.) Algorithms can only be tested with historical data. Much of that historical data does not include other HFT players, so conclusions from tests are invalid. AND there is no way to test how stable the market will be when HFTs play against HFTs.

The report erroneously states investors buy and sell shares based on fundamentals. 

1.) Long term trends are NOT determined by fundamentals. I've performed spectral analysis on financial markets, and the results show dominant cycles of rise and fall. This also proves financial markets are not purely random.

2.) When plotting the Dow Industrial Average on a semi-log chart, one can draw a straight line with a positive slope. This proves markets are not random, AND markets are UNSTABLE BY NATURE.

3.) People invest as herds because they can't process complex data, or the data set is incomplete. People rely on informal group consensus when making a financial decision. These behaviors are patterned and show up on the charts. Someone with training in Elliott Wave Theory (a 12-year-old can master this technique) has a better chance of identifying market trends, and when they are about to change, months before fundamental data is reported.

Thomas SF

A complete collapse may be the best thing for the people. 

paul_from_cleveland

"caused the firm too loose $440 million"? Don't they teach spelling anymore?