The Folly of Prediction (World Cup Style)

by Attain Capital

Anybody else having a problem with these seemingly scientific “odds of winning” statistics being thrown about by Bloomberg and FiveThirtyEight around the World Cup? Here’s the chance of England beating Uruguay today, from Bloomberg:

And the advanced look from Nate Silver’s new FiveThirtyEight blog:

[FiveThirtyEight World Cup prediction graphic]

Now, we’re big fans of Nate Silver, and wouldn’t mind seeing more data in journalism as a whole – but this whole exercise of providing odds of winning seems a bit off to us. For one, it’s been a little annoying to hear people around town saying things like “Australia has a 9% chance of winning versus the Netherlands,” as if they had just measured the temperature outside or were quoting a stock price. Moreover, there have been some really off “predictions,” like Spain having a 53% chance to beat the Netherlands in its opening game, then losing 5 to 1 and being eliminated from the whole tournament in just two games. For you non-soccer fans, that’s like the Seattle Seahawks making it back to the playoffs next year and losing 56 – 3.

The temperature, your height, and IBM’s stock price are all measurable things, with actual scientific answers. The percentage probability that England will beat Uruguay is not measurable, and there is no correct value for it. Despite the pretty graphics and official-sounding “predictions,” there’s actually no statistic you can assign to such a prediction. There are just three possible outcomes – a win by Team A, a win by Team B, or a tie. So technically, the odds of winning are 1 out of the three possibilities, or 33%. But the data journalists take it much further – with Nate Silver explaining that they use the SPI (Soccer Power Index) algorithm to come to such predictions, which is in fact quite thorough, looking at each player’s stats and performance between international and national play (for more on SPI, see here and here).
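To make concrete what kind of calculation sits behind a number like “53% to win,” here’s a minimal sketch – not the actual SPI, just a common textbook approach – where each team gets an expected-goals rating, goals are assumed to arrive as independent Poisson counts, and the probabilities of every scoreline are added up. The ratings below are made up purely for illustration.

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """Probability of scoring exactly k goals when the expected count is lam."""
    return lam ** k * exp(-lam) / factorial(k)

def match_probabilities(lam_a, lam_b, max_goals=10):
    """Win/draw/loss probabilities for Team A vs. Team B, assuming independent
    Poisson goal counts with means lam_a and lam_b (illustrative model only)."""
    p_win = p_draw = p_loss = 0.0
    for a in range(max_goals + 1):
        for b in range(max_goals + 1):
            p = poisson_pmf(a, lam_a) * poisson_pmf(b, lam_b)
            if a > b:
                p_win += p
            elif a == b:
                p_draw += p
            else:
                p_loss += p
    return p_win, p_draw, p_loss

# Hypothetical "ratings": Team A expected to score 1.6 goals, Team B 1.1.
print(match_probabilities(1.6, 1.1))  # roughly (0.49, 0.25, 0.26) -- not 33/33/33
```

The point isn’t that this toy model is right; it’s that any such number is a model output, built on assumed ratings, not a measurement.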

Mr. Silver earned the right to be making soccer predictions by doing a great job predicting elections. But predicting elections is a bit different, as there is statistical data on registered voters, election-turnout data based on weather, population sizes of counties, and people who actually get asked who they will vote for. You get “inside information,” in a way. You get to know how the players are going to play the game before the game is played. Nate Silver’s genius was stripping out all the biases and noise in those polls and zeroing in on what the right “inside information” was.

But analyzing the past performance of a soccer team and its players isn’t “inside information,” it’s past performance. And past performance has a nasty way of being a terrible indicator of future performance, especially when you have 11 different people on each team. At the very least, that’s 22 degrees of freedom thrown into the mix. Add in what they ate the day before, how they’re feeling, whether they got in a fight with their significant other, the weather, the referee, and all the rest, and you’re talking hundreds if not thousands of moving parts. That is – as they say – why they play the game.

Why do we care? Because there’s a whole lot of prediction nonsense in the financial markets that is almost continually wrong:

  • Stock price targets
  • End of year Dow or S&P targets
  • GDP targets
  • Unemployment
  • Housing Starts
  • Crop Reports
  • And the list goes on and on…

Where is Nassim Taleb on data journalism and spurious correlations and the role of luck and all the rest? Where is Barry Ritholtz on the folly of prediction (see here, here, and here)?

To be fair, Nate Silver does point out that more complex algorithms mean more possibilities for error, and he has done studies “checking their work.” Still, the World Cup predictions are just that – predictions. They are saying more that this team would beat that team about 65% of the time, all else being equal, if they played 10,000 times – than that it has a 65% chance of beating the other team today! It’s semantics, but we think it’s important to bear in mind.
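That “if they played 10,000 times” framing can be made literal with a quick simulation. The sketch below uses made-up win probabilities (the 65% and the Spain-style 53% are just the numbers from the discussion above) and simply replays the “same” match 10,000 times to show what the figure actually describes.

```python
import random

def win_frequency(p_win, n=10_000, seed=42):
    """Frequency interpretation of a win probability: replay the 'same' match
    n times and return the fraction of hypothetical replays the favorite wins.
    p_win here is an assumed input, not something measured from real games."""
    rng = random.Random(seed)
    wins = sum(rng.random() < p_win for _ in range(n))
    return wins / n

# A "65% favorite" still fails to win roughly 3,500 of 10,000 replays...
print(win_frequency(0.65))        # ~0.65
# ...and a Spain-style 53% favorite fails to win nearly half the time.
print(1 - win_frequency(0.53))    # ~0.47
```

Which is exactly why a 53% favorite losing its opener shouldn’t be treated as the model being “wrong” – or right, for that matter.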

The World Cup is one thing, but what happens when the data journalists start posting likelihoods of the market finishing the year up such and such, and real people with real money act on it? As Freakonomics put it in a podcast a few years ago:

It’s impossible to predict the future, but humans can’t help themselves. From the economy to the presidency to the Super Bowl, educated and intelligent people promise insight and repeatedly fail by wide margins.

Maybe the data journalists should be regulated and have to put big ‘Past Performance does not necessarily guarantee future results’ disclaimers on their articles, or at least admit that while fun to do, and good for page views, the odds of correctly assigning all of these probabilities are about the same as the 1 in 9.2 quintillion chance of perfectly filling out the March Madness bracket.
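For reference, that 9.2 quintillion figure is just 63 coin flips: a March Madness bracket has 63 games, and treating each pick as a 50/50 guess gives 2^63 possible brackets.

```python
# Where the "1 in 9.2 quintillion" number comes from: 63 games,
# each treated as a 50/50 coin flip, so 2**63 possible brackets.
print(2 ** 63)           # 9223372036854775808
print(f"{2 ** 63:.2e}")  # 9.22e+18, i.e. ~9.2 quintillion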
