John Hawver

Renaissance, Moore's Law, and Alpha

I’m back! It’s been a while since I last blogged, and I have lots of ideas stored up. The first comes from reading a book by Gregory Zuckerman: “The Man Who Solved the Market: How Jim Simons Launched the Quant Revolution.”


The first three-quarters of the book is excellent if you have an interest in trading, finance, and/or technology. The last quarter dives a bit too much into politics but doesn’t diminish the overall quality of the book. Zuckerman does an excellent job of laying out the chronology of Renaissance Technologies from the bits and pieces that most of us in the industry had noticed.


Certain details particularly struck me: the data collection and cleaning, the use of ridge regressions (implied), and the application of machine learning (neural networks) to their strategies. Renaissance was using these techniques well before the broader industry commoditized them, and in some cases perhaps before industry leaders thought to use them. Living up to its name, Renaissance was/is at the leading edge of technology.


And you know what else struck me? Why now? After beating the market and competitors for years, and being known for utmost secrecy, why would Renaissance break its own Omerta?


Moore’s law, postulated by Gordon Moore in 1965 (and revised in 1975), says that the number of transistors per integrated circuit doubles roughly every two years. The prediction has held up remarkably well: by 2019, the first 7-nanometer chips had been introduced. If the log-linear relationship continues to hold over the next decade, we could see 1-nanometer chips by 2030. To get there, all kinds of non-trivial physics and materials problems will have to be solved. If we do get there, that represents roughly a 7x to 15x improvement in computing power by the end of the decade. Pause to think about how powerful the iPhone of 2030 will be.
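
As a quick check on that arithmetic, here is a minimal Python sketch of the log-linear extrapolation (the two-year doubling period is the textbook figure, and the baseline is just a reference point; the gap between raw transistor scaling and the more conservative 7x-15x computing-power estimate reflects that realized performance grows more slowly than transistor count):

```python
# A back-of-the-envelope sketch of the log-linear extrapolation above:
# transistor counts doubling every two years from a 2019 baseline.
BASE_YEAR = 2019
DOUBLING_PERIOD_YEARS = 2.0

def transistor_multiple(year: int) -> float:
    """Projected multiple of the base-year transistor count."""
    return 2.0 ** ((year - BASE_YEAR) / DOUBLING_PERIOD_YEARS)

# 2019 -> 2030 is 11 years: 2**(11/2) is roughly 45x in raw transistor
# count. Realized computing power scales more slowly than transistor
# count, hence the more conservative 7x-15x figure above.
print(f"2030: {transistor_multiple(2030):.0f}x the 2019 transistor count")
```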


The story of Renaissance could also be told as the story of how Moore’s law, and the associated technological improvements that came with it - data, connectivity, memory, networking, and especially machine learning - could be applied to trading and investing. Jim Simons, at the very least, intuited the effect Moore’s law would have and pursued the opportunity in finance before many others did.


Alpha is the active return on an investment above the “beta” of just being long the market. One can think of alpha as being caused by a collection of irrational decisions made by some set of market players for a whole host of reasons. In the parlance of physics, alpha would be called noise. Trading algorithms are built to extract alpha, or noise, from the market, essentially transforming it into heat (going out the back of servers) and money. At any given timeframe, only so much alpha/noise exists.
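
To make that definition concrete, here is a minimal sketch of the standard regression decomposition, using synthetic returns (every number below is an illustrative placeholder): regress a portfolio’s excess returns on the market’s, and the slope is beta while the leftover intercept is alpha.

```python
# A minimal sketch of estimating alpha and beta by ordinary least squares:
# portfolio excess return = alpha + beta * market excess return + noise.
# All return series here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
market_excess = rng.normal(0.0004, 0.01, 252)  # one year of daily market excess returns
portfolio_excess = 0.0002 + 1.2 * market_excess + rng.normal(0.0, 0.005, 252)

# Regress portfolio excess returns on a constant and the market's.
X = np.column_stack([np.ones_like(market_excess), market_excess])
(alpha, beta), *_ = np.linalg.lstsq(X, portfolio_excess, rcond=None)

print(f"alpha (daily): {alpha:.5f}  beta: {beta:.2f}")
```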


The investing universe is a reasonably closed set. There are only a limited number of assets to trade and, despite the growing set of alternative data, only a limited amount of data impacts the price of those assets. So the winners in trading and investing tend to be those that apply data objectively and efficiently with a repeatable process. To achieve that type of process there are two dimensions firms can pursue: latency and quantitative research.


Latency is the more deterministic of the two; being faster than your competitors is generally good and can be achieved through engineering and spending resources. Over the last decade, there have been dramatic changes to the latency structure of the market. The book Flash Boys and the movie The Hummingbird Project sensationalize these changes but are, in reality, years behind the actual industry. Microwave links between market sites have become the norm; measurements previously done in milliseconds are now done in micro- or nanoseconds; processes previously run on servers now run on FPGA cards. But the most profound change is no longer how fast the fastest is; it is the vast shrinking of the distribution of latency between market players. Every type of firm in the investing universe is aware of latency differences and has taken some sort of action to minimize their negative effects.


Similarly, the quantitative research landscape has changed dramatically in the last decade. Previously, to do research with a variety of models, you had to build the infrastructure yourself; in the early days, building your own Lasso or Ridge regression was a clear advantage. Now, with the advent of Python, R, TensorFlow, etc., and the multitude of packages available to support all types of models and research, the “model” part is trivial compared to data cleaning and feature (alpha) engineering. And with advances in GPUs and cloud computing, deep learning and neural nets have become common and accessible. Data becomes the real differentiator.
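
As an illustration of just how commoditized the “model” part has become, here is a minimal sketch: a Ridge regression that firms once had to hand-build is now a few lines of scikit-learn. The features and target below are synthetic placeholders; in practice, producing them through data cleaning and feature engineering is the genuinely hard part.

```python
# A minimal sketch of an off-the-shelf Ridge regression with scikit-learn.
# X and y are synthetic placeholders standing in for engineered features
# (alphas) and a prediction target.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 20))                                  # hypothetical features
y = X @ rng.normal(size=20) + rng.normal(scale=0.5, size=1000)   # synthetic target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = Ridge(alpha=1.0)  # here "alpha" is the L2 penalty strength, not a return
model.fit(X_train, y_train)
print(f"out-of-sample R^2: {model.score(X_test, y_test):.3f}")
```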


Now, to tie up these threads. Why would Renaissance break their Omerta? I don’t think there are really that many secrets left to keep. The relentless march of Moore’s law in finance has commoditized the extraction of alpha. The alpha pie, as it were, is only as big as the number of irrational decisions made. With algorithms (presumably more objective than a human) making more of the trading and investing decisions by the day, and projecting forward with Moore’s law, the alpha pie will inevitably shrink. The market will trend towards Fama’s famed efficient market boundary. If you’re going to have a book written about your fund, doing it somewhere near the top makes sense.


So what does the relentless push of Moore’s Law mean for the rest of the investing world over the next decade?


1. Large firms that currently have a technological advantage in terms of latency, market structure, data, and/or machine learning will continue to get bigger and take a larger share of the pie, if they haven’t already reached a maximal point.


2. Alpha will become increasingly difficult for a human to repeatedly find at almost any investment horizon. We’re already seeing this in the dismal performance of non-quantitative hedge funds and the inability of simple linear factor investing to beat the market. Passive investment strategies will continue to dominate active ones.


3. Any firm that does not incorporate technology/machine learning as a core pillar of its process or structure will get left behind. Investment firms, in particular, will have to be able to scale technologically to combat shrinking fee structures.


4. Data will be the investment “moat” of the 21st century.






