The subject of asset bubbles and market crashes has fascinated me for more than 20 years. As an options market maker at Susquehanna International Group (“SIG”), I dealt with extreme price movements daily. I sat next to Jeff Yass for years and watched him manage option positions in thousands of different stocks. Almost daily he would be celebrating a big win in a stock that had made an unusually large move (SIG loves to own the “teenie” puts).

At some point, a very interesting question popped into my head:

Why is it that 10-sigma events happen all the time?


Current risk models and option pricing models suggest that these events should happen almost never.^{(1)}
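As a back-of-the-envelope check, here is a small Python sketch of the arithmetic behind the “10-sigma” claim (assuming 252 trading days per year and the 32%-volatility example from the footnote):

```python
import math
from scipy.stats import norm

# A stock with 32% annualized volatility has a daily sigma of roughly
daily_sigma = 0.32 / math.sqrt(252)   # about 2.0% per trading day

# so a one-day 20% drop is close to a 10-sigma move
z = 0.20 / daily_sigma                # about 9.9 standard deviations

# Under a normal distribution, a move that extreme should essentially
# never happen over any realistic horizon.
p = norm.sf(z)                        # one-sided tail probability, on the order of 1e-23
```

Yet such moves are observed regularly, which is the puzzle this article addresses.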

The science of risk management and derivative pricing revolves around one thing: estimating future prices of assets. For over 50 years the world of finance has defaulted to using a normal distribution to estimate the probabilities of future price moves. Why? Is it a physical law of the universe that asset prices follow a normal distribution? There are plenty of other statistical distributions that asset prices could follow.

*So, why use the Normal distribution?*

Sadly, the answer is that the normal distribution is mathematically neat and easy to use. It doesn’t work, but it’s easy to use. Some of the smartest people in the world have lost billions of dollars because they wanted to use an easy formula (e.g. Long-Term Capital Management).

Over the years, people began to realize that the normal distribution does a terrible job of predicting future price moves. But instead of trying to find a distribution that does work, people decided to mangle the normal into something useful. They squeezed and stretched it with intelligent-sounding terms like leptokurtosis and skew. It does work better now, but still not great.

### Is There A Better Model For Stocks Than the Normal Distribution?

For many years I have been using a different distribution that works much better, and has made a fortune for the companies in which I have worked. This formula is simple, easy to use, and has the added benefit of being able to arguably identify market bubbles before they burst. As you probably know, when an asset is in the midst of a bubble, the probability of an unusually large downside move is greatly increased. The good news is that the options world does not know how to identify bubbles, and often prices the out-of-the-money puts too low.

The normal curve is one member of a family of distributions known as “stable distributions.” If we assume mean = 0 and unit scale, the characteristic function for this family is φ(t) = exp(−|t|^α). The difficult part of using this equation is estimating the parameter α, which can vary over 0 < α ≤ 2. When α = 2, you get a normal distribution; when α = 1, you get a Cauchy distribution; and when 1 < α < 2, you get a Pareto-Levy distribution. Financial engineers have defaulted to assuming α = 2 at all times, but there is no reason to believe this accurately models asset returns. History shows us that a more accurate distribution should be more peaked around the mean, and have fatter tails. Using a Pareto-Levy distribution with 1 < α < 2 produces just such a curve (fig. 1).
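To make the tail difference concrete, here is a small Python sketch using `scipy.stats.levy_stable`; the choice of α = 1.5 is just an illustrative value inside the Pareto-Levy range, not a value taken from the article:

```python
from scipy.stats import levy_stable, norm

alpha = 1.5                        # illustrative tail index, between Cauchy (1) and normal (2)
stable = levy_stable(alpha, 0.0)   # beta = 0 gives a symmetric stable distribution

# Probability of a move beyond 5 units under each model:
p_normal = norm.sf(5)              # thin normal tail
p_stable = stable.sf(5)            # fat Pareto-Levy tail

# The stable law assigns orders of magnitude more probability to the tail.
print(p_stable / p_normal)
```

The ratio of the two tail probabilities runs into the tens of thousands, which is the sense in which the normal distribution says large moves “should happen almost never.”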

This distribution does what financial engineers have been trying to do to the normal.

Now, as mentioned, using α = 2 does not produce accurate estimates for future price moves. So, the question becomes: what value of α should be used? The potential answer to this question is surprisingly simple and elegant. α can be calculated from the historical price returns of each individual asset you are modeling. In fact, the answer is to use a different α for each asset, depending on its individual historical returns.

### Introducing the Hurst Exponent

The method of calculating α uses a statistical procedure called R/S (rescaled range) analysis, which is then used to calculate a Hurst Exponent for the data in question.^{(2)} Harold Hurst used this method to calculate how high to build the Aswan Dam on the Nile River in Egypt in the early 1900s. He recognized that historical Nile River flooding data (which went back thousands of years) was not normally distributed, but displayed a long-term “memory.” This exponent has been given the symbol H in his honor. This method of statistical analysis is not widely used, but there is plenty of literature on the subject.

The Hurst Exponent (and R/S analysis) basically measures how fast a data set is scaling over time. The parameter H varies from 0 to 1 (0 ≤ H ≤ 1). The key information that H reveals is that if H = 0.5 then the data set is scaling as a purely random system would scale. For 0.5 < H < 1 the data set is scaling faster than random, and for 0 < H < 0.5 it is scaling slower than random. For our purposes, the focus will be on assets where H > 0.5. This means that the price is scaling in time faster than random (i.e. the price is going up too quickly relative to the daily price moves). An asset whose price steadily increases in this fashion might contradict the Efficient Market Hypothesis, making the normal distribution inappropriate for predicting its future price moves.
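A minimal R/S sketch in Python, assuming daily returns as input; the power-of-two chunking and the simple log-log regression below are one common variant of the procedure, not the only one:

```python
import numpy as np

def hurst_rs(returns, min_chunk=8):
    """Estimate the Hurst exponent of a return series via R/S analysis."""
    returns = np.asarray(returns, dtype=float)
    n = len(returns)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs_list = []
        for start in range(0, n - size + 1, size):
            chunk = returns[start:start + size]
            z = np.cumsum(chunk - chunk.mean())   # cumulative deviation from the chunk mean
            r = z.max() - z.min()                 # range of the cumulative deviations
            s = chunk.std(ddof=1)                 # scale (standard deviation)
            if s > 0:
                rs_list.append(r / s)
        if rs_list:
            sizes.append(size)
            rs_vals.append(np.mean(rs_list))
        size *= 2
    # The slope of log(R/S) against log(window size) is the Hurst exponent.
    h, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return h
```

For i.i.d. random returns the estimate should come out near 0.5 (small-sample R/S estimates are known to be biased somewhat above 0.5, which corrections such as Anis-Lloyd address).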

As discussed earlier, we want to use the Pareto-Levy distribution, but we don’t know the correct α to use in the formula. Amazingly, the Hurst exponent is related to α through the simple formula α = 1/H. So, when H = 0.5, then α = 2. This means that if we do R/S analysis of historical prices and determine they are random, then we can use the normal distribution. If we do R/S analysis and determine that the prices are not random, then we simply use the Pareto-Levy distribution and calculate the correct α. The key thing to understand is that the closer α comes to 1 (which means H approaches 1), the fatter the tails get.
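In code, the mapping is a one-liner (a sketch; the function name `alpha_from_hurst` is my own, not from the article):

```python
def alpha_from_hurst(h):
    """Map a Hurst exponent H to the stable-distribution tail index via alpha = 1 / H."""
    if not 0 < h <= 1:
        raise ValueError("H must lie in (0, 1]")
    return 1.0 / h

# H = 0.5 (a purely random series) recovers the normal case alpha = 2:
print(alpha_from_hurst(0.5))   # 2.0
```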

The question now is how to use H once it is calculated. In my experience, the Hurst Exponent usually falls between 0.4 and 0.8 (0.4 < H < 0.8), and is often near 0.5, indicating that the asset’s returns are random. In the cases where H is near 0.5, the widely used normal distribution is fine. The most interesting cases occur when 0.6 < H < 0.8, which may indicate a large amount of “herding,” or “crowding,” in the asset. To summarize, I categorize assets as follows:

- 0.4 < H < 0.6 Asset is approximately random
- 0.61 < H < 0.7 Moderate bubble forming (i.e. moderate herding)
- 0.71 < H < 0.8 Critical bubble forming (i.e. critical herding)
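These buckets are straightforward to encode (a sketch; the function name and the handling of out-of-range values are my own assumptions):

```python
def classify_hurst(h):
    """Bucket an asset by its Hurst exponent, following the article's categories."""
    if 0.4 < h <= 0.6:
        return "approximately random"
    elif 0.6 < h <= 0.7:
        return "moderate bubble forming (moderate herding)"
    elif 0.7 < h <= 0.8:
        return "critical bubble forming (critical herding)"
    else:
        return "outside the typical 0.4-0.8 range"
```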

The key to understanding these categories is that when 0.61 < H < 0.8 there is a *hidden* amount of volatility and risk associated with the asset, because a bubble is potentially forming. Remember, via α = 1/H this Hurst range translates to 1.25 < α < 1.64, and stable distributions in this alpha range have INFINITE variance. The market completely misunderstands and misprices this hidden tail risk.

So, when H > 0.7, a bubble might be forming because prices are increasing in a non-random way. This price action is often caused not by news or corporate events, but by non-fundamentals-based market participants.

### What Does the Hurst Exponent Say About the Markets Today?

The history of bubbles and crashes in markets is well documented, but rarely is there agreement about bubbles until after they burst. With this simple formula, bubbles can be mathematically identified and categorized BEFORE they burst. In addition, options traders should consider this formula in their pricing algorithms, otherwise, the out-of-the-money puts will likely be mispriced.

As of 10/18/2017, the S&P 500 index has an H exponent of 0.71, and is therefore in the critical bubble category. The range of likely outcomes is arguably much larger than what the normal distribution would suggest. Arguably, the market has a lot of hidden risk right now, and a VIX at 10 is deceiving. That said, markets can remain irrational for a long time…

- The views and opinions expressed herein are those of the author and do not necessarily reflect the views of Alpha Architect, its affiliates, or its employees.

**References**

1. Just for context, a 32% volatility stock that drops 20% in a day is a 10-sigma event. I’m pretty sure everyone has seen that happen.
2. Here is a great site on calculating the Hurst Exponent.

**Article by Keith Kline, Alpha Architect**