What can an interesting and surprising experiment with finance students and finance professionals tell us about financial decisions, and about how to maximise the returns we can extract from systems with low information content?
It is well known that humans are bad at estimating probabilities. We overestimate the likelihood of very low probability events(1) and we get confused estimating the relative probabilities of events. Many of the well-known behavioural biases in finance stem from incorrect estimation of probabilities. Worse still, it turns out that even when we know the probability distribution, our use of it is very poor indeed.
Haghani and Dewey constructed a coin flipping experiment. The participants were told that the coin they were flipping had a 60% chance of landing heads and a 40% chance of landing tails. They were given $25 and told they could bet as much as they wanted on each toss and they could continue to play the game for 30 minutes and could toss the coin as often as they liked. The maximum payout for each participant was capped at $250.(2)
The 61 participants were either finance and economics students or young financial industry professionals. They were fully briefed on the game and, unlike so many psychology experiments, the experimenters told them the truth! You really were playing a game with a 60% chance of winning.
You can play the game yourself but before you do, take a minute and think about what your strategy should be.
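If you would rather experiment on a computer than with a real coin, here is a minimal sketch of the game in Python. The function name and the cap of 300 flips are our own choices (the real experiment used a 30-minute time limit, not a flip count); the strategy shown bets a fixed fraction of the current stake on heads each time.

```python
import random

def play_game(bet_fraction, flips=300, stake=25.0, cap=250.0, p_heads=0.6, seed=None):
    """Simulate one play of the biased-coin game: bet a fixed fraction of
    the current stake on heads each flip, stopping at the payout cap or
    at (effective) ruin."""
    rng = random.Random(seed)
    for _ in range(flips):
        bet = stake * bet_fraction
        if rng.random() < p_heads:
            stake += bet   # heads: win the amount bet
        else:
            stake -= bet   # tails: lose the amount bet
        if stake >= cap:
            return cap     # winnings are capped at $250
        if stake < 0.01:
            return 0.0     # effectively bust
    return stake
```

Try a few values of `bet_fraction` and see how often you go bust versus hit the cap.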
Even without a deep grounding in probability theory, some rules seem fairly sensible.(3)
- It makes sense to bet on heads most, if not all, of the time
- Since the game is biased in your favour, it makes sense to play as often as you can
- It makes sense not to bet your entire stake on a single coin toss (because you’ve got a 40% chance of losing everything)
- Betting a small amount every time ($1 say) is unlikely to maximise winnings
- If you’ve made money up to now then you can probably afford to bet a bit more
These all seem like pretty sensible rules. If you’ve got 30 minutes to spare then give it a go…although you may save yourself time by reading the rest of this post first.
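To put a rough number on the fourth rule, here is a small Monte Carlo comparison (our own sketch, not from the paper; the trial count and strategies are illustrative) of a flat $1 bet against betting a fixed 20% of the current stake:

```python
import random

def simulate(bet_rule, trials=2000, flips=300, p_heads=0.6, cap=250.0, seed=0):
    """Average final stake over many plays of the biased-coin game,
    where bet_rule maps the current stake to the next bet size."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        stake = 25.0
        for _ in range(flips):
            bet = min(bet_rule(stake), stake)  # can't bet more than you have
            if rng.random() < p_heads:
                stake += bet
            else:
                stake -= bet
            if stake >= cap:
                stake = cap   # payout is capped at $250
                break
            if stake < 0.01:
                stake = 0.0   # effectively bust
                break
        total += stake
    return total / trials

flat = simulate(lambda s: 1.0)      # always bet $1
prop = simulate(lambda s: 0.2 * s)  # bet 20% of the current stake
print(f"flat $1: ${flat:.2f} on average, 20% of stake: ${prop:.2f} on average")
```

In runs like this the proportional strategy reliably ends up far ahead of the flat $1 bet, which rarely gets anywhere near the cap.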
What is the optimal strategy?
Firstly, you should never bet on tails. Never ever ever ever. That should be pretty obvious. How much to bet is a little more complicated but not much more.
With some fairly loose but sensible assumptions(4) the optimal fraction F* of your wealth to bet is given by

F* = 2p − 1

where p is the probability that you win. This is the famous Kelly Criterion, which was originally formulated in 1956 and is very well known both in the gambling community and the finance community.(5)
This has all the properties that one would expect. If you have no information (it’s a 50/50 bet), then don’t bet. If you’re 100% sure, then bet the farm. And because the bet is a fraction of wealth, your stake rises and falls in proportion as your wealth increases or decreases.
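Those properties are easy to check in a couple of lines of Python (a sketch; the helper name is ours):

```python
def kelly_fraction(p: float) -> float:
    """Kelly fraction for an even-money bet with win probability p:
    f* = 2p - 1, floored at zero so we never bet with no edge."""
    return max(0.0, 2.0 * p - 1.0)

print(kelly_fraction(0.5))  # 50/50: don't bet at all
print(kelly_fraction(0.6))  # the experiment's coin: bet 20% of your stake
print(kelly_fraction(1.0))  # a sure thing: bet the farm
```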
How did the subjects do?
Figure 1: Source: Haghani and Dewey (2016)
In Figure 1 we have five, fairly arbitrary, categories:
- “Bust” is less than $2 at the end
- “Down” are those people who ended up with less than $25
- “Poor” is between $25 and $100
- “Better” is between $100 and $250
- “Maxed” is hitting $250
During the rest of this piece, we’re going to continue to use these groupings.
The big surprise is that so many people (17) went bust. These are educated financial professionals who managed to go completely bust in a game which, unlike the “real” financial world, was simple, understandable and massively rigged in their favour. Nearly as surprising is that 18 of the subjects bet their entire bankroll on a single flip, thus giving themselves a 40% chance of complete ruin.
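That 40% figure compounds brutally for anyone who keeps betting the whole bankroll. A quick check (our own arithmetic, not from the paper):

```python
def ruin_probability(n_all_in_flips: int, p_heads: float = 0.6) -> float:
    """Probability of at least one losing flip, i.e. total ruin, when
    betting the entire bankroll on each of n consecutive flips."""
    return 1.0 - p_heads ** n_all_in_flips

for n in (1, 2, 3, 5):
    print(f"{n} all-in flips: {ruin_probability(n):.1%} chance of ruin")
```

One all-in flip is a 40% chance of ruin; five in a row pushes it above 90%, even with the coin rigged in your favour.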
The authors also note that, despite an incredibly clear description of the probabilities, nearly half of the participants bet on tails more than five times, which is almost laughably sub-optimal. To be fair, if you have read any behavioural finance or psychology literature, experimenting with a tails bet once or twice may actually be a smart strategy: it’s always possible in these experiments that the researchers are really investigating something else. Think of the infamous Milgram Experiment! I can imagine thinking “I wonder if something interesting will happen if I click on tails”.(6) But five times?
This is a well designed and simple experiment which uncovers some very interesting(7) issues. The authors of the paper conclude
“There is a meaningful gap in the education of young finance and economics students when it comes to the practical applications of the concepts of utility and risk taking.”
We would strongly support this conclusion and wholeheartedly support a better link between theory and practical applications in finance, statistics and computer science. Nevertheless, our purpose here is somewhat different: we want to look at the results of this experiment through a different lens. Do the results indicate a better way to invest now? The wheels of academia grind slowly and investors can’t wait for a new generation of better educated practitioners to arrive.
How good is good?
Whilst a 60% chance of winning seems like quite a low probability, in the field of finance (and gambling) this is about as good as it gets. Blackjack players who can count cards perfectly(8) can expect a 2.5% edge on every hand. This is not quite equivalent to 52.5/47.5 odds due to the variable bet sizes, but it’s a good approximation. A 2.5% financial upside probably doesn’t offset the downside of the casino taking you outside and breaking your legs when they discover you counting cards…
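To see just how much better the experiment's coin is than "as good as it gets" in the real world, we can plug both into the even-money Kelly rule from earlier. This is a sketch under the simplifying assumption of fixed even-money payouts, using the 52.5/47.5 approximation above for the card counter:

```python
import math

def kelly_growth(p: float) -> tuple[float, float]:
    """Even-money Kelly fraction for win probability p, plus the
    resulting expected log-growth of the bankroll per bet."""
    f = max(0.0, 2.0 * p - 1.0)
    g = (p * math.log(1 + f) + (1 - p) * math.log(1 - f)) if f < 1 else float("inf")
    return f, g

# The experiment's coin vs (approximately) a perfect card counter
for label, p in (("60/40 coin", 0.6), ("card counter", 0.525)):
    f, g = kelly_growth(p)
    print(f"{label}: bet {f:.1%} of bankroll, expected growth {g:.3%} per bet")
```

The 60/40 coin supports a 20% bet and roughly 2% expected bankroll growth per flip; the card counter's edge supports only a 5% bet and growth an order of magnitude smaller per hand.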
As we have repeatedly said before, we humans are bad at estimating probabilities and also bad at estimating what constitutes a good performance. It would be very