In “Black box” blues I argued that automated trading was a potentially dangerous element to include in a quantitative investment strategy, citing the “program trading / portfolio insurance” crash of 1987. When the market started falling in 1987, the computer programs caused the writers of derivatives to sell on every down-tick, which some suggest exacerbated the crash. Here’s New York University’s Richard Sylla discussing the causes (via Wikipedia).
The internal reasons included innovations with index futures and portfolio insurance. I’ve seen accounts that maybe roughly half the trading on that day was a small number of institutions with portfolio insurance. Big guys were dumping their stock. Also, the futures market in Chicago was even lower than the stock market, and people tried to arbitrage that. The proper strategy was to buy futures in Chicago and sell in the New York cash market. It made it hard — the portfolio insurance people were also trying to sell their stock at the same time.
The Economist’s Buttonwood column has an article, Model behaviour: The drawbacks of automated trading, which argues along the same lines that automated trading is potentially problematic where too many managers follow the same approach:
[If] you feed the same data into computers in search of anomalies, they are likely to come up with similar answers. This can lead to some violent market lurches.
Buttonwood divides the quantitative approaches to investing into three types, assessing each for its potential either to provide a stabilizing influence on the market or to throw fuel on the fire in a crash:
1. Trend-following, the basis of which is that markets have “momentum”:
The model can range across markets and go short (bet on falling prices) as well as long, so the theory is that there will always be some kind of trend to exploit. A paper by AQR, a hedge-fund group, found that a simple trend-following system produced a 17.8% annual return over the period from 1985 to 2009. But such systems are vulnerable to turning-points in the markets, in which prices suddenly stop rising and start to fall (or vice versa). In late 2009 the problem for AHL seemed to be that bond markets and currencies, notably the dollar, seemed to change direction.
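To make the idea concrete, here is a minimal trend-following sketch in Python. This is purely illustrative and is not AQR’s or AHL’s actual model: it uses the simplest possible rule, going long when the latest price is above its trailing moving average and short when it is below. The window length and the moving-average rule itself are assumptions for the example.

```python
def moving_average(prices, window):
    """Simple moving average over the trailing `window` prices."""
    return sum(prices[-window:]) / window

def trend_signal(prices, window=10):
    """Return +1 (go long) if the latest price is above its moving
    average, -1 (go short) if below, 0 if flat or too little data."""
    if len(prices) < window:
        return 0
    ma = moving_average(prices, window)
    last = prices[-1]
    if last > ma:
        return 1
    if last < ma:
        return -1
    return 0

# A steadily rising series yields a long signal, a falling one a short.
rising = [100 + i for i in range(20)]
falling = [100 - i for i in range(20)]
print(trend_signal(rising))   # 1
print(trend_signal(falling))  # -1
```

The vulnerability Buttonwood describes is visible in the rule itself: at a turning point the moving average lags the price, so the model stays positioned with the old trend just as it reverses.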
2. Value, which seeks securities that are cheap according to “a specific set of criteria such as dividend yields, asset values and so on”:
The value effect works on a much longer time horizon than momentum, so that investors using those models may be buying what the momentum models are selling. The effect should be to stabilise markets.
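A value screen of the kind Buttonwood describes can be sketched in a few lines. The tickers, prices, and dividends below are hypothetical, and real value models use many criteria; this toy version ranks by dividend yield alone and keeps the cheapest half.

```python
# Hypothetical universe: ticker -> (price, annual dividend per share).
stocks = {
    "AAA": (50.0, 3.0),
    "BBB": (80.0, 1.6),
    "CCC": (20.0, 1.4),
    "DDD": (100.0, 1.0),
}

def dividend_yield(price, dividend):
    """Annual dividend as a fraction of the current price."""
    return dividend / price

# Rank by yield, highest (cheapest on this measure) first,
# and keep the top half as the "value" picks.
ranked = sorted(stocks, key=lambda t: dividend_yield(*stocks[t]), reverse=True)
value_picks = ranked[: len(ranked) // 2]
print(value_picks)  # ['CCC', 'AAA']
```

Because a screen like this buys what has fallen until it looks cheap, it naturally takes the other side of momentum-driven selling, which is the stabilizing effect Buttonwood points to.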
3. Arbitrage, which exploits price differentials between securities where no such price differential should exist:
This ceaseless activity, however, has led to a kind of arms race in which trades are conducted faster and faster. Computers now try to take advantage of arbitrage opportunities that last milliseconds, rather than hours. Servers are sited as close as possible to stock exchanges to minimise the time taken for orders to travel down the wires.
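The index-arbitrage trade Sylla describes for 1987 can be sketched with the standard cost-of-carry relationship. The numbers, threshold, and the continuous-compounding fair-value formula below are illustrative assumptions, not any firm’s actual system: the idea is only that when the futures quote drifts far enough from fair value, the arbitrageur buys the cheap leg and sells the rich one.

```python
import math

def fair_futures_price(spot, rate, div_yield, years):
    """Cost-of-carry fair value of an index future,
    assuming continuous compounding."""
    return spot * math.exp((rate - div_yield) * years)

def arbitrage_signal(spot, futures, rate, div_yield, years, threshold=0.001):
    """Compare the futures quote to fair value and name the trade."""
    fair = fair_futures_price(spot, rate, div_yield, years)
    gap = (futures - fair) / fair
    if gap > threshold:    # futures rich: sell futures, buy the cash basket
        return "sell futures / buy cash"
    if gap < -threshold:   # futures cheap: buy futures, sell the cash basket
        return "buy futures / sell cash"
    return "no trade"

# With the futures trading well below fair value, as in October 1987,
# the textbook trade is to buy futures and sell stock in the cash market.
print(arbitrage_signal(spot=280.0, futures=270.0, rate=0.07,
                       div_yield=0.03, years=0.25))
# buy futures / sell cash
```

The arms race Buttonwood describes is about executing exactly this kind of comparison faster than everyone else, which is why the gaps that remain last only milliseconds.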
In arguing that automated trading can be problematic where too many managers pursue the same strategy, Buttonwood gives the example of the August 2007 crash, which sounds eerily similar to Sylla’s explanation for the 1987 crash above:
A previous example occurred in August 2007 when a lot of them got into trouble at the same time. Back then the problem was that too many managers were following a similar approach. As the credit crunch forced them to cut their positions, they tried to sell the same shares at once. Prices fell sharply and portfolios that were assumed to be well-diversified turned out to be highly correlated.
It is interesting that overcrowding is the same problem identified by GSAM in Goldman Claims Momentum And Value Quant Strategies Now Overcrowded, Future Returns Negligible. In that presentation, Robert Litterman, Goldman Sachs’ Head of Quantitative Resources, said:
Computer-driven hedge funds must hunt for new areas to exploit as some areas of making money have become so overcrowded they may no longer be profitable, according to Goldman Sachs Asset Management. Robert Litterman, managing director and head of quantitative resources, said strategies such as those which focus on price rises in cheaply-valued stocks, which latch onto market momentum or which trade currencies, had become very crowded.
Litterman argued that only special situations and event-driven strategies that focus on mergers or restructuring provide opportunities for profit (perhaps because these strategies require human judgement and interaction):
What we’re going to have to do to be successful is to be more dynamic and more opportunistic and focus especially on more proprietary forecasting signals … and exploit shorter-term opportunistic and event-driven types of phenomenon.
As we’ve seen before, human judgement is often flawed. Buttonwood says:
Computers may not have the human frailties (like an aversion to taking losses) that traditional fund managers display. But turning the markets over to the machines will not necessarily make them any less volatile.
And we’ve come full circle: Humans are flawed, computers are the answer. Computers are flawed, humans are the answer. How to break the deadlock? I think it’s time for Taleb’s skeptical empiricist to emerge. More to come.