High Frequency Trading and the New Algorithmic Ecosystem

by seangourley on August 9, 2012

A few months back I got an email from Bruce Cahan, who was putting together a TEDx event called TEDxNewWallStreet. The purpose of the event was to challenge some of the fundamental paradigms of Wall Street, and Bruce wanted to know if I would put together a lecture for it based on my work with algorithms and Big Data. The only requirement was that the content be somewhat experimental, with bonus points if it was a touch controversial. Experimental and controversial are two great motivators for me, so I told him I was a ‘maybe’. I spent the next week bouncing around ideas with some of my favorite thinkers in Silicon Valley, reading journal articles, sketching notes, analyzing equations and drinking a lot of coffee down at Cafe Trieste. At the end of the week I finally came up with something that I wanted to talk about: a talk at the intersection of High Frequency Trading, machine readable news, and predator-prey ecosystem stability. A talk challenging our reliance on algorithms, and our control by them.

The video of the talk is embedded below; read on for the story behind the ideas.

The talk hinged on three key papers that had caught my eye as being individually very interesting — unconnected perhaps, but potentially something novel lying in the white space between them. These were the papers:

Ecosystem stability (see the sketch just after this list)
http://bit.ly/y0nzPQ
Machine readable social media analysis
http://www.necsi.edu/research/social/nyttwitter/nyt.pdf
Mathematical analysis of high frequency crashes
http://arxiv.org/abs/1202.1448/
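
For readers who haven't come across the ecosystem stability literature, the canonical starting point is the Lotka-Volterra predator-prey model, and it is the mental picture I had in mind when framing algorithms as predators and prey. Below is a minimal Python sketch of those dynamics; the parameter values are invented for illustration and are not taken from the paper linked above.

    # Minimal Lotka-Volterra predator-prey sketch (illustrative parameters only).
    # dx/dt = a*x - b*x*y   (prey reproduce, and get eaten)
    # dy/dt = d*x*y - c*y   (predators grow by eating prey, otherwise die off)

    a, b, c, d = 1.0, 0.1, 1.5, 0.075   # made-up rates, chosen for illustration
    x, y = 10.0, 5.0                    # initial prey / predator populations
    dt, steps = 0.001, 20000            # simple Euler integration

    history = []
    for _ in range(steps):
        dx = (a * x - b * x * y) * dt
        dy = (d * x * y - c * y) * dt
        x, y = x + dx, y + dy
        history.append((x, y))

    # Populations cycle rather than settling: prey boom, predators follow,
    # prey crash, predators starve.
    xf, yf = history[-1]
    print(f"after {steps * dt:.0f} time units: prey {xf:.2f}, predators {yf:.2f}")

Crude Euler integration is enough here: the point is simply that tightly coupled populations of predators and prey don't settle into a quiet equilibrium, they cycle, and that is the lens the ecosystem stability work brings to markets.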

And this is the pitch I came up with for the lecture and emailed through to Bruce:
——————————————————————-

Financial markets provide liquidity to the world. In today’s society the markets should be considered a public utility, something more akin to clean water than the modern day casino that they have become. However financial markets, unlike water, are incredibly complex. Indeed the majority of financial transactions are now trades made by algorithms, non-human software agents. These trades happen on sub-600ms time scales, beyond the limits of human decision making.

This type of trading is called high frequency trading, and the world that it inhabits is the new financial ecosystem. There are predatory algorithms, parasitic algorithms, and algorithms that are preyed upon. These algorithms are not smart at the moment, only capable of processing a few bytes of information and generating a few cents per trade. But they are getting smarter. They are now starting to process unstructured news, the kinds of news that humans read, and they are making decisions that can potentially generate more profit.
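
To make the idea of machines ‘reading’ the news concrete, here is a deliberately naive Python sketch of that kind of pipeline. Everything in it, the word lists, the threshold, the headlines, is invented for illustration; real machine-readable news systems are far more sophisticated, but the basic shape is the same: ingest text, score it, turn the score into an order.

    # Toy news-sentiment trading rule -- illustrative only, every detail invented.
    POSITIVE = {"beats", "record", "upgrade", "growth", "surge"}
    NEGATIVE = {"misses", "fraud", "downgrade", "lawsuit", "plunge"}

    def sentiment(headline):
        """Crude bag-of-words score: +1 per positive word, -1 per negative word."""
        words = headline.lower().split()
        return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

    def decide(headline, threshold=1):
        """Turn a headline into a (hypothetical) order signal."""
        score = sentiment(headline)
        if score >= threshold:
            return "BUY"
        if score <= -threshold:
            return "SELL"
        return "HOLD"

    print(decide("Acme Corp beats estimates, raises growth outlook"))   # BUY
    print(decide("Regulators open fraud lawsuit against Acme Corp"))    # SELL

The unnerving part is that nothing in this loop requires a human to ever see the headline before the order goes out.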

This high-speed algorithmic world is not, however, isolated from the human time scales of the world we live in. The instability of micro-second crashes is highly correlated with global macro instability. Indeed the 10 stocks with the most micro crashes were all major financial institutions that had massive volatility on a human time scale. The high frequency financial ecosystem is incredibly important, perhaps as important as access to clean water. Yet instability in this system is correlated with instability in the world we humans inhabit. It is too important to regulate out of existence and too damaging to leave unregulated.
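
The micro crashes referred to here are the ultrafast ‘fractures’ counted in the arXiv paper linked above: roughly, a run of consecutive down-ticks that wipes out a meaningful fraction of a stock’s price in well under two seconds. The Python sketch below is my own back-of-the-envelope version of that counting; the thresholds (ten ticks, a 0.8% move, 1500 ms) echo the numbers used in that line of work but should not be read as a faithful reimplementation of the paper’s methodology.

    def count_micro_crashes(ticks, min_ticks=10, min_move=0.008, max_ms=1500):
        """ticks: list of (timestamp_ms, price) in time order.
        Counts monotone downward runs of at least `min_ticks` steps that lose
        at least `min_move` (as a fraction of price) within `max_ms` milliseconds."""
        crashes = 0
        run_start = 0
        for i in range(1, len(ticks) + 1):
            # keep extending the run while prices are still ticking down
            if i < len(ticks) and ticks[i][1] < ticks[i - 1][1]:
                continue
            down_steps = (i - 1) - run_start
            t0, p0 = ticks[run_start]
            t1, p1 = ticks[i - 1]
            if (down_steps >= min_ticks
                    and (p0 - p1) / p0 >= min_move
                    and (t1 - t0) <= max_ms):
                crashes += 1
            run_start = i   # a new downward run can only begin after this point
        return crashes

    # Fabricated example: a 12-step fall of about 1.2% inside 600 ms, then a rebound.
    fake = [(i * 50, 100.0 - 0.1 * i) for i in range(13)] + [(700, 99.5)]
    print(count_micro_crashes(fake))   # -> 1

Run over a day of tick data for each symbol, a counter like this is the kind of thing that lets you rank stocks by how often they fracture at machine time scales, which is the comparison behind the “10 stocks with the most micro crashes” claim above.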

We as a human species must control a system that is beyond the limits of our raw cognitive understanding. To do this we have two choices. We can create software to augment our human abilities, the software equivalent of a robotic exoskeleton for our intelligence. Or we can create fully autonomous algorithmic agents, a new set of algorithmic species, and set them loose into the sub-microsecond world; perhaps we can control the system by competing within it. Either way, within the next few years a robot will have read this text, processed it and made a trade before you’ve even got past the first sentence — and in another three, the machine will be the one writing the article in the first place.

——————————————————————-

Bruce replied back and said he loved the concept, and with that I went ahead and constructed a talk around this narrative arc. I needed to be able to take people into the world of algorithms, to let them see things that in some ways don’t really exist, and to help them understand how these small packages of computer code could have such an impact on the world that we live in. Another week, lots more notes, plenty of coffee, and the outline slowly became a reality. I presented an initial cut of the lecture to Mike Driscoll and some of the other Big Data experts from the likes of Twitter, Square and Metamarkets. It’s always scary putting a new set of ideas out to a group like this, but they were helpful in refining the ideas and pushing me to a deeper understanding of the full implications of what I was saying. The talk got a lot better and my ideas clearer. After all of that, and in the space of two weeks, I had my talk, and in it a new way of understanding High Frequency Trading. It was off to the TEDxNewWallStreet event at the Computer History Museum in Mountain View, across from the Google campus. Speaking after Joe Lonsdale, a co-founder of Palantir, it was my turn to present these new ideas.

Here’s the video from the event (excuse the editing, it’s a bit of a rough cut — but the ideas are all there and the charts from the folks at Nanex do a great job of visually conveying the information)
