When I was a little kid we had a sand box. It was a quicksand box. I was an only child … eventually.
Steven Wright

A brief note today on what might be an arcane subject for some but is a great example of the most basic question in risk management – are you thinking about your risk questions in a way that fits the fundamental nature of your data? Do you understand the fundamental nature of your data? Our business incentivizes us to build complex and ingenious models and data analysis systems in order to generate an edge or dodge a bullet. But are we building our elaborate mental constructs on solid ground? Or on quicksand?

I’ve spent a lot of time recently talking with clients about measuring market risk across a wide range of asset classes and securities as part of an adaptive investment strategy, and I get a lot of smart questions. One of the best was deceptively simple – what do you think about using implied volatility to measure risk? – and that’s the question I want to use to illustrate a larger point.

First let’s unpack the question. Volatility is a measurement of how violently the returns of a security jump around, and in professional investment circles the word “volatility” is typically used as shorthand for risk – the higher the volatility, the greater the embedded risk. There are some valid concerns and exceptions to this conflation of the two concepts, but by and large I think it’s a very useful connection.

Within the general concept of volatility there are two basic ways of measuring it. You can look backwards at historical prices over some time period to figure out how violently those prices actually jumped around – what’s called “realized volatility” – or you can look forward at option prices for that same security and figure out how violently investors expect that prices will jump around in the future – what’s called “implied volatility”. Both flavors of volatility have important uses, even though they mean something quite different. For example, a beta measurement (how much a security’s price moves relative to an underlying index) is based on realized volatility. On the other hand, the VIX index – the most commonly reported gauge of overall market risk or complacency – is entirely based on the implied volatility of short to medium-term options on the S&P 500.
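To make the realized flavor concrete, here is a minimal Python sketch of the standard close-to-close calculation, annualized with the usual 252-trading-day convention. The price series is made up purely for illustration:

```python
import numpy as np

def realized_volatility(closes, trading_days_per_year=252):
    """Annualized close-to-close realized volatility.

    Backward-looking by construction: it uses only prices
    that have already printed.
    """
    closes = np.asarray(closes, dtype=float)
    log_returns = np.diff(np.log(closes))          # daily log returns
    return log_returns.std(ddof=1) * np.sqrt(trading_days_per_year)

# Made-up month of daily closes, for illustration only
closes = [100.0, 101.2, 99.8, 100.5, 102.1, 101.0, 103.4, 102.8,
          104.0, 103.1, 105.2, 104.6, 106.0, 105.1, 107.3, 106.5,
          108.0, 107.2, 109.1, 108.4, 110.0]
print(f"Annualized realized vol: {realized_volatility(closes):.1%}")
```

Implied volatility runs in the other direction: you start from a quoted option price and invert a pricing model (Black-Scholes, for instance) to solve for the volatility that reproduces it, and the VIX itself aggregates a strip of S&P 500 option prices into a single 30-day implied volatility number.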

The big drawback to using realized or historical volatility is that it is, by nature, backwards looking. It tells you exactly where you’ve been, but only by extrapolation provides a signal for where you are going. In a business where you always want to be looking forward, this is a problem. Using realized volatility means that you will always be reacting to changes in the broad market characteristics of your portfolio; you will never be proactive about looming changes that might well be embedded within the “wisdom of the crowd” as found in forward-looking options prices. If you’re relying on realized volatility, no matter how sensitively or smartly you set the timing parameters, you will always be late. This was the point of the smart question I was asked: isn’t there useful information in the risk expectations of market participants, information that allows you to be proactive rather than reactive … and shouldn’t you be using that information as you seek to balance risk across your portfolio?

My answer: yes … and no. Yes, there is useful information in implied volatility for many purposes. But no, not for the purpose of asset allocation. Why not? Because we are living in the Golden Age of Central Bankers, and that wreaks havoc on the fundamental nature of market expectations data.

Here’s an example I’ve used before to illustrate this point, courtesy of Ed Tom and the Credit Suisse derivatives strategy group. Figure 1 shows the term structure (implied price level at different future times based on prices paid for options) of the VIX index on October 15th, 2012.

Figure 1: 8-Month Forward VIX Term Structure

If you recall, there was great consternation regarding the Fiscal Cliff at this time, not to mention the uncertainty surrounding the November elections. That consternation and uncertainty are reflected in the term structure, which is much steeper than is typical for a spot VIX level of 15, indicating that the market anticipates progressively higher S&P 500 volatility, to an unusual degree, from January 2013 onwards. The way to read this chart is that the market expects a VIX level of 18 three months in the future (January 15), 19 three and a half months in the future (January 31), 20 four months in the future (February 15), and so on. All of these readings are higher than one would typically expect for future expectations of the VIX from this starting point (essentially flat at 17).
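As a rough illustration of how to read a curve like this, here is a small Python sketch using the handful of points quoted above; the flat ~17 level stands in for the typical curve one would expect from a spot VIX of 15, and all numbers are read off the chart as described in the text, not taken from any data feed:

```python
# Forward VIX points read off Figure 1 (months ahead -> implied VIX)
term_structure = {3.0: 18, 3.5: 19, 4.0: 20}
typical_flat_level = 17  # the "normal" curve from a spot VIX of 15

for months_ahead, implied_vix in sorted(term_structure.items()):
    premium = implied_vix - typical_flat_level
    print(f"{months_ahead:.1f} months out: implied VIX {implied_vix} "
          f"({premium:+d} vs. the typical flat ~{typical_flat_level})")
```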

Now take a look at Figure 2, which shows the Credit Suisse estimation of the underlying distribution of VIX expectations for January 31, 2013.

Figure 2: January 31, 2013 Forward Estimated VIX Exposures

The way to read this chart is that a lot of market participants have a Bullish view (low VIX) of what the world will look like on January 31, with a peak frequency (greatest number of bullish contracts) at 15 and a fairly narrow distribution of expectations around that. Another group of market participants clearly has a Bearish view (high VIX) of the world on January 31, with a peak frequency around 24 and a fairly broad distribution around that.

So what’s the problem? The problem is that Figure 1, which is what you would come up with based on public options data, says that the most likely implied price for the VIX on January 31, 2013 is 19. But Figure 2, which is based on the trading data that Credit Suisse collects, says that a VIX level of 19 is the least likely outcome. What Figure 2 tells you is that almost no one expects the outcome to end up in the middle at a price of 19, even though that is the average implied price of all the exposures.

Usually the average implied price of a security is also the most likely estimated price outcome of the security. That is, if options on a security imply an average price of 19 a few months from now, exposures will generally form some sort of bell curve centered on the price of 19. The most common estimation of the price would be 19, with fewer people estimating a higher price and fewer people estimating a lower price. But in those situations – like expectations of future VIX levels on October 15, 2012 – where there’s not a single-peaked distribution, all of our math and all of our models and all of our intuitively held assumptions go right out the window.
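To see why a bi-modal distribution breaks the usual intuition, here is a minimal Python sketch of a toy mixture shaped like Figure 2: a tight bullish cluster around 15 and a looser bearish cluster around 24. All of the numbers are illustrative assumptions, not Credit Suisse’s actual exposure data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy bi-modal mixture echoing Figure 2: a bullish camp clustered
# tightly around VIX 15, a bearish camp spread loosely around 24.
bullish = rng.normal(loc=15, scale=1.0, size=60_000)
bearish = rng.normal(loc=24, scale=3.0, size=40_000)
expectations = np.concatenate([bullish, bearish])

average_view = expectations.mean()     # what the term structure reflects
counts, edges = np.histogram(expectations, bins=80)
peak = counts.argmax()
modal_view = (edges[peak] + edges[peak + 1]) / 2

print(f"Average expectation:     {average_view:.1f}")  # lands near 18-19
print(f"Most common expectation: {modal_view:.1f}")    # bullish peak near 15
# The average falls in the valley between the two peaks: it sits at
# close to the least likely outcome, exactly the trap described above.
```

Any model that treats that average as the consensus view is centering itself on the one outcome almost nobody actually holds.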

Unfortunately, these bi-modal market expectation structures are now the rule rather than the exception in this, the Golden Age of the Central Banker. Why? Because monetary policy since March 2009 has explicitly established itself as an emergency bridge for financial markets, a bridge
