Do you think the computer-powered trading algorithms on Wall Street come equipped with Ctrl+Z?
New Jersey-based Knight Capital lost $440 million yesterday thanks to a computer glitch. I use the word “glitch” here ironically – it seems to be tossed around when nobody wants to fess up to bad code and the responsible parties would rather assign blame to the computer itself, as if it has sinned. In Knight’s case, a “software malfunction” cost it four times its yearly earnings in one day. I would call this a “management malfunction”, or “human error”, but maybe I’m just not hip to the Wall Street jargon. I’m sure that Knight’s trading software did exactly what it was coded to do, and that its designers simply didn’t account for the circumstances that gave rise to the epic loss.
This is what philosopher Nassim Taleb would call a Black Swan, a highly disruptive, unpredictable (or at least unpredicted) event. One of the key points in Taleb’s The Black Swan is that humans are astonishingly blind to uncertainty and far too confident of their ability to predict the future (this conclusion is bolstered by the excellent Thinking, Fast and Slow by Daniel Kahneman). A corresponding weakness is our tendency to “understand” the past with a degree of coherency that is unjustifiable. Hence, we think that unpredictable events could have been predicted after all, if only we had paid attention to the right data. In the next few days, I imagine a lot of very smart people will get to the bottom of what happened in the Knight computer, and make sure this specific calamity never happens again — if Knight survives long enough for that to make a difference. But in my mind, it’s not a matter of looking for the problem that led to this particular meltdown. I’m sure some valuable lessons will be learned from the shambolic Knight episode about the errors in one market model or one line of code, but will that help us predict the next variety of financial Armageddon?
Today, we have such a huge wave of data washing over us that trying to filter out the relevant bits is like holding a Brita up to a tsunami. Our globalized economy is so complex that nobody could hope to fully understand what makes it tick. The more you know about the real world, the fuzzier and less intelligible it begins to look. Where we don’t suffer from data overload, we suffer from excessive secrecy (especially with regard to the workings of government). Finance is opaque to even the most plugged-in data lords or politically connected hedge fund managers. Trillions of dollars of derivatives are traded out of public view, and the CEOs of the too-big-to-fails have no idea how the quants from MIT are making them money through automated trading.
Instead of trying to predict the form and timing of the next screw-up, we should be looking at the underlying technological and legal rule sets that have created a dangerously flawed system. The fragile world of high finance is so vulnerable to these kinds of events that we’re guaranteed to see one variety of disastrous “anomaly” after another, unless the whole system is made more robust. We will never be able to eliminate anomalies, but we can build systems that are not so vulnerable to them.
This isn’t about computers. Remember the “rogue trader” who was able to lose $2 billion for UBS bank in 2011? What kind of system do we have if one guy can really toss around that kind of money without authorization? (The fact that he would have been rewarded, not charged with fraud, had he made his firm money is evidence of the system’s perverse incentives, which make it even more fragile.) More recently, JP Morgan Chase lost upwards of $5 billion on a trade that was meant to hedge (!) risk. It’s funny to hear CEO Jamie Dimon say, “We’re not making light of this error, but we do think it’s an isolated event.” Unfortunately, nothing has been done to make our financial system more robust since we were all made keenly aware of its weakness in 2008. As a result, the costs of the system are still being offloaded onto the public through implicit bailout guarantees. I shouldn’t even care that some bank was stupid enough to lose billions on one trade, but since its health is vital to the entire world economy, I’m forced to care.
More troubling is that the impact and frequency of Black Swans are bound to increase as the world becomes more interconnected. Case in point: the collateralized debt obligations (CDOs) that banks created in the early 2000s were meant to reduce risk, but ended up spreading risk throughout the system. Warren Buffett famously labeled derivatives like these “financial weapons of mass destruction”. I think that we are building up an unseen arsenal of systemic WMDs, the nature of which we can only guess.