Philip Tetlock – Edge Master Class 2015: A Short Course In Superforecasting [Class V]

by John Brockman, Russell Weinberger, and Nina Stegeman, Edge.org

Condensing it All Into Four Big Problems and a Killer App Solution

With Philip Tetlock [9.22.15]


 

Edge Master Class 2015 with Philip Tetlock
— A Short Course in Superforecasting

| Class 1 | Class 2 | Class 3 | Class 4 | Class 5 |


 

Philip Tetlock: If you turn to session six, slides 117 and 118, you’re going to see a little piece on the seductive power of scenarios. Imagine you’ve got one of these between-subjects designs in which half of the people read the top slide and half read the bottom slide, and then each group judges the plausibility or probability of the outcome.

The first one, on slide 117, is the likelihood of a flood anywhere in North America in the next thirty years killing 1000 people or more. The second, on slide 118, is the likelihood of a flood anywhere in North America, triggered by an earthquake causing a dam to collapse, in the next thirty years killing 1000 people or more.


You can imagine randomly assigning people so that half read one version and half read the other. A moment’s contemplation reveals that it would be very odd if people judged the bottom slide—the more detailed one—to be more probable than the top one.
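The logic can be made concrete with a few invented numbers. The detailed scenario is a conjunction, so its probability is the broad event’s probability multiplied by a conditional probability, and it can never be larger. A minimal sketch (every figure below is illustrative, not from the study):

```python
# Minimal sketch of the conjunction rule. All numbers are invented
# for illustration; none come from the study being described.

p_flood = 0.10              # P(A): some 1000-death flood in 30 years
p_dam_given_flood = 0.05    # P(B|A): that flood is a quake-collapsed dam

# The detailed scenario is the conjunction "A and B", so
# P(A and B) = P(A) * P(B|A), which can never exceed P(A).
p_detailed = p_flood * p_dam_given_flood

print(f"P(any flood)        = {p_flood:.3f}")
print(f"P(quake-dam flood)  = {p_detailed:.4f}")
assert p_detailed <= p_flood  # adding detail can only shrink the event set
```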

It’s probably obvious to everybody around this room because this is an analytically high-powered group, but it’s not obvious to most people. It’s an important part of forecaster training, and an important part of becoming a superforecaster, to be aware that there is this similarity across three superficially very different things: the scope insensitivity subjects show in Danny’s contingent valuation experiments when judging the value of ducks and lakes in Ontario, the problems regular forecasters have in distinguishing the likelihood of Assad falling in six months versus twelve months, and the dam scenarios. Is it clear how these things are related?
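One way to make the relation explicit: in all three cases a narrower event is nested inside a broader one, so a coherent forecaster must give the narrower event no more probability than the broader one. A minimal coherence check along those lines, with invented numbers:

```python
# Hedged sketch of the shared structure: if event A's outcomes are a
# subset of event B's, any coherent forecast must satisfy P(A) <= P(B).
# All probabilities below are invented for illustration.

def check_nested(p_narrow: float, p_broad: float, label: str) -> None:
    """Flag forecasts that rate a narrower event as more probable."""
    status = "coherent" if p_narrow <= p_broad else "INCOHERENT"
    print(f"{label}: {p_narrow:.2f} vs {p_broad:.2f} -> {status}")

# "Falls within six months" is nested inside "falls within twelve months".
check_nested(0.40, 0.35, "Assad, 6 months vs 12 months")
# "Quake-caused dam flood" is nested inside "any flood".
check_nested(0.12, 0.10, "dam-collapse flood vs any flood")
```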

This also ties in somewhat to the vexing problem of how people judge explanations versus forecasting accuracy, because we are tempted by rich narratives. It’s easy for us to transport ourselves imaginatively into the more detailed narrative; it’s more interesting, more engaging. It’s a better story. And of course, it’s less probable. You don’t want to get suckered by attribute substitution. You don’t want to replace the question, is this an accurate forecast? with, is this an interesting story? It is tempting in many situations to do that. I make that point on slide 119.


On slide 120 I talk about something we talked a bit about yesterday, which is that scenarios can be a source of bias. They can cause you to attribute too much probability to too many possibilities. You can violate the axioms of probability: probabilities can start adding up to more than one, and that’s a warning sign that you’re incoherent. But you can fight fire with fire, and you can use scenarios when you think backward in time. I mentioned that yesterday in connection with counterfactuals and hindsight bias. Getting people to imagine counterfactual alternatives to reality is a way of counteracting hindsight bias.
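That incoherence warning sign can be checked mechanically: if a set of scenarios is meant to be mutually exclusive and exhaustive, the probabilities assigned to them must sum to one. A rough sketch, with invented scenario labels and numbers:

```python
# Rough sketch of a scenario-coherence check. Scenario names and
# probabilities are invented; vivid branches tend to get inflated.

scenario_probs = {
    "regime survives": 0.55,
    "negotiated exit": 0.30,
    "violent collapse": 0.35,
}

total = sum(scenario_probs.values())
print(f"total probability mass: {total:.2f}")

if total > 1.0:
    # One crude repair: renormalize so the mass sums to one. This restores
    # coherence but not calibration; the real fix is re-judging each branch.
    scenario_probs = {k: v / total for k, v in scenario_probs.items()}
    print({k: round(v, 2) for k, v in scenario_probs.items()})
```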

Hindsight bias is the difficulty people have in remembering their past states of ignorance, what they thought before they knew the outcome. Counterfactual scenarios can reconnect us to those past states of ignorance, and that can be a useful, humbling exercise. In the long run, it’s good mental hygiene and useful for debiasing.

Our most recent training modules for superforecasting emphasize this bicycle-riding metaphor: balancing offsetting errors. Some errors are more likely in some environments than others. Some errors may be more likely in general than others. Each of the errors on slide 121 is logically and psychologically possible, and it’s helpful for people to be aware of them. You have the perils of over-adjusting and under-adjusting to evidence. Over-adjusting to evidence is when people see the big crowds forming in Moscow in early 2012 and conclude Putin is finished. Under-adjusting is the opposite failure: sticking with a prior estimate even as disconfirming evidence accumulates.
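One common way to formalize the over/under-adjustment distinction (a standard device in the judgment literature, not a formula from the class) is to raise the likelihood ratio in Bayes’ rule to a power c, where c = 1 is the ideal Bayesian, c > 1 over-adjusts, and c < 1 under-adjusts. A sketch with invented numbers:

```python
# Hedged sketch: Bayes' rule with an adjustment exponent c on the
# likelihood ratio. The prior and likelihood ratio are invented.

def update(prior: float, likelihood_ratio: float, c: float) -> float:
    """Posterior P(H|E) from prior P(H) and LR = P(E|H) / P(E|not H)."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio ** c
    return posterior_odds / (1 + posterior_odds)

prior = 0.20  # invented prior that Putin loses power
lr = 3.0      # invented likelihood ratio for "big crowds in Moscow"

for c, label in [(1.0, "ideal Bayesian"), (3.0, "over-adjusts"), (0.3, "under-adjusts")]:
    print(f"{label:>14}: posterior = {update(prior, lr, c):.2f}")
```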


