Psychology of Intelligence Analysis, Behavioral Finance and the CIA

Taken Oct. & Nov. 2012 by Kyle Mowery, Founder of GrizzlyRock Capital, notes from The Psychology of Intelligence Analysis by Richards Heuer, Jr.

Chapter 1: “Thinking About Thinking”

  • Thinking analytically is a skill learned via doing (page 2)
  • Some of the CIA’s best analysts learned through failures early in their careers (page 2)
  • The concept of bounded rationality means that we create simplified mental models, which are inherently simpler than reality (page 3)
  • “Intelligence analysts must understand themselves before they can understand others.” (page 4)
  • Problems can occur in individuals working alone, in small groups, and in large organizations. Heuer focuses on individuals, as these issues “are probably the most insidious.” (page 6)

Chapter 2: “Perception: Why Can’t We See What Is There to Be Seen?”

  •  “We tend to perceive what we expect to perceive” (page 8)
  •  “Mindsets tend to be quick to form but resistant to change” (page 10)
  •   New information is applied to existing thought or model (page 11)
  •   Initial exposure to blurred or ambiguous stimuli interferes with accurate perception even after more and better information becomes available. (page 13)
  •   A prudent analysis system should do the following: (page 16)

o Encourage analysts to clearly delineate their assumptions and specify the degree and source of the uncertainty involved

o Periodically examine key problems from the ground up, in order to avoid the pitfalls of incrementalism

o Institutionalize alternative points of view (Kyle Mowery note: one can prove something false, but proving it true is much harder)

Chapter 3: “Memory: How Do We Remember What We Know?”

  •  Having experiences physically changes your brain (page 20)
  • “The key factor in transferring information from short-term to long-term memory is the development of associations between the new information and schemata already available in memory. This, in turn, depends on two variables: the extent to which the information to be learned relates to an already existing schema, and the level of processing given to the new information.” (page 23)
  •   Memories are seldom reassessed or reorganized retroactively in response to new information (page 29)

Chapter 4: “Strategies for Analytical Judgment: Transcending the Limits of Incomplete Information”

  • “Judgment” is what analysts use to fill in gaps in their knowledge. (page 31)
  •  “Reasoning by comparison is a convenient shortcut, one chosen when neither data nor theory are available for other analytical strategies, or simply because it is easier and less time consuming than a more detailed analysis.” (page 38)
  •  Analysts can gain objectivity by making assumptions explicit so that they may be examined and challenged, not by vain efforts to eliminate them from analysis. (page 41)
  •  “Although analysts usually cannot apply the statistical procedures of scientific methodology to test their hypotheses, they can and should adopt the conceptual strategy of seeking to refute, rather than confirm, hypotheses.” (page 46)
  •  “An optimal analytical strategy requires that most analysts search for information to refute their favorite theories, not employ a satisficing strategy that permits acceptance of the first hypothesis that seems consistent with the evidence.” (page 48)

Chapter 5: “Do You Really Need More Information?”

  •  “Once an experienced analyst has the minimum information necessary to make an informed judgment, obtaining additional information generally does not improve the accuracy of their estimates” (page 51)
  •  More information leads analysts to be more confident – but not more accurate in their analysis. Horse handicapping example: more info but same prediction accuracy & increased confidence. (page 52)
  • Experienced analysts have an imperfect understanding of what information they actually use in making judgments. (page 52)

Chapter 6: “Keeping an Open Mind”

  •  New ideas are the beginning of the creative process (page 65)
  •  “Thinking backwards (imagining yourself in the future discussing events) changes the focus from whether something might happen, to how it might happen.” (page 71)
  •  The use of a devil’s advocate exposes conflicting interpretations and shows alternative viewpoints (page 72)
  •  Creativity can be developed (page 76)
  •  Optimal results come from alternating between individual thinking and team effort, using group interaction to generate ideas that supplement individual thought. (page 78)
  •  A questioning attitude is a prerequisite to a successful search for new ideas (page 81)

Chapter 7: “Structuring Analytical Problems”

  • Problems must be decomposed, as we cannot keep all relevant issues in our consciousness at the same time (page 85)
  •  Ben Franklin used the following tool for decision making: he would write pros and cons of a decision on each side of a piece of paper as they came to mind over a few days. He would then strike pros and cons of equivalent weight and see where the balance of the items lay. (page 87)
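Franklin’s procedure is mechanical enough to sketch in code. The following is a minimal illustration of his “strike equal weights and see where the balance lies” method, extended slightly so that unequal items cancel by their difference; all items and weights are hypothetical examples, not from the book.

```python
def franklin_balance(pros, cons):
    """Strike pros and cons of equivalent weight, then compare what remains.

    pros, cons: lists of (description, weight) tuples.
    Returns the uncancelled items on each side.
    """
    # Sort ascending so .pop() always takes the heaviest item on each side.
    pros = sorted(pros, key=lambda item: item[1])
    cons = sorted(cons, key=lambda item: item[1])
    remaining_pros, remaining_cons = [], []
    while pros and cons:
        p, c = pros.pop(), cons.pop()
        if p[1] == c[1]:
            continue  # items of equal weight cancel out
        elif p[1] > c[1]:
            remaining_pros.append((p[0], p[1] - c[1]))
        else:
            remaining_cons.append((c[0], c[1] - p[1]))
    remaining_pros.extend(pros)
    remaining_cons.extend(cons)
    return remaining_pros, remaining_cons

# Hypothetical decision: take a new job?
pros = [("higher salary", 3), ("new skills", 2)]
cons = [("longer commute", 3), ("less vacation", 1)]
kept_pros, kept_cons = franklin_balance(pros, cons)
```

Here “higher salary” and “longer commute” cancel exactly, and the residual balance tilts toward the pros, which is the judgment Franklin would read off the paper.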

Chapter 8: “Analysis of Competing Hypotheses”

  • Most analysts selectively consider data which supports their intuitively assumed answer (page 95)
  •  Analysis of Competing Hypotheses is an 8-step procedure, grounded in cognitive psychology, decision analysis, and the scientific method. (page 97)

o Step 1: Identify the possible hypotheses to be considered. Use a group of analysts with different perspectives to brainstorm the possibilities.
o Step 2: Make a list of significant evidence and arguments for and against each hypothesis. Especially note evidence that is absent, i.e. things that did not happen (as in Sherlock Holmes noting that the dog did not bark)
o Step 3: Prepare a matrix with hypotheses across the top and evidence down the side. Analyze the “diagnosticity” of the evidence and arguments. That is, identify which items are most helpful in judging the relative likelihood of the hypotheses.
o Step 4: Refine the matrix. Reconsider the hypotheses & delete evidence & arguments that have no diagnostic value
o Step 5: Draw tentative conclusions about the relative likelihood of each hypothesis. Proceed by trying to disprove the hypotheses rather than to prove them.
o Step 6: Analyze how sensitive your conclusion is to a few critical items of evidence. Consider the consequences for your analysis if that evidence were wrong, misleading, or subject to a different interpretation.
o Step 7: Report conclusions. Discuss the relative likelihood of all the hypotheses, not just the most likely one.
o Step 8: Identify milestones for future observation that may indicate events are taking a different course than expected

  • Analysis of Competing Hypotheses ensures an appropriate process of analysis. While it increases the odds of success and leaves an audit trail of thought, it isn’t a silver bullet. (page 109)
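Steps 3 through 5 above can be sketched as a small data structure. The hypotheses, evidence items, and consistency ratings below are hypothetical placeholders, not from the book; the point is the mechanics: drop non-diagnostic evidence, then rank hypotheses by how little evidence contradicts them, since ACH proceeds by disproof rather than proof.

```python
hypotheses = ["H1", "H2", "H3"]

# Step 3: matrix of evidence vs. hypotheses.
# Ratings: "C" consistent, "I" inconsistent, "N" neutral.
evidence = {
    "E1": {"H1": "C", "H2": "C", "H3": "C"},  # rates all hypotheses the same
    "E2": {"H1": "C", "H2": "I", "H3": "I"},
    "E3": {"H1": "N", "H2": "C", "H3": "I"},
}

# Step 4: refine the matrix -- evidence that rates every hypothesis
# identically has no diagnostic value and is dropped.
diagnostic = {e: ratings for e, ratings in evidence.items()
              if len(set(ratings.values())) > 1}

# Step 5: tentative ranking. Seek to disprove: the hypothesis with the
# FEWEST inconsistent evidence items survives the attempted refutation best.
inconsistencies = {h: sum(1 for ratings in diagnostic.values()
                          if ratings[h] == "I")
                   for h in hypotheses}
ranked = sorted(hypotheses, key=lambda h: inconsistencies[h])
```

In this toy matrix, E1 is discarded as non-diagnostic, and H1 ranks first because nothing in the remaining evidence refutes it.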

Chapter 9: “What are Cognitive Biases?”

  • Cognitive biases are consistent and predictable mental errors, caused by our simplified information processing strategies (page 111)
  •  A cognitive bias does not result from any emotional or intellectual predisposition towards a certain judgment, but rather from subconscious mental procedures for processing information (page 111)

Chapter 10: “Biases in Evaluation of Evidence”

  • Analysts seldom consider the absence of evidence in decision making (page 115)
  •  Information people receive directly (by their own eyes & ears) is more salient than second-hand information (page 116)
  •  Seeing should not always mean believing. (page 117)

Chapter 11: “Biases in Perception of Cause & Effect”

  •  Analysts have a need to assign order to their environments. As such, analysts often seek causes for what are actually accidental or random phenomena. (page 127)
  •  Analysts show a significant bias toward attributing outcomes to the actor in a situation rather than to the situation surrounding the actor. (page 127)
  •  Overestimation bias is a significant problem (page 132)
  •  Additional discussion of actor vs. situation (page 135)
  •  Discussion on correlation & causation (page 140)

Chapter 12: “Biases in Estimating Probabilities”

  •  People rely on the “availability rule” in estimating probabilities: the more easily analysts can imagine or recall a scenario, the more probable they judge it to be. (page 147)
  •  Analysts often irrationally anchor to a certain starting point, even when the data indicate the estimate should lie elsewhere (page 147)
  •  Base rate fallacy: the base rate of events occurring is neglected by analysts, unless an understandable causal relationship is perceived (page 157)
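The base-rate fallacy can be made concrete with a short Bayes' rule calculation; the numbers below are a hypothetical illustration, not from the book. Even a fairly reliable indicator yields a low posterior probability when the underlying event is rare.

```python
# Hypothetical numbers: 1% of cases are genuine, and an indicator
# fires for 90% of genuine cases but also for 10% of non-genuine ones.
base_rate = 0.01    # prior: P(genuine)
hit_rate = 0.90     # P(indicator | genuine)
false_alarm = 0.10  # P(indicator | not genuine)

# Total probability of seeing the indicator at all.
p_indicator = hit_rate * base_rate + false_alarm * (1 - base_rate)

# Bayes' rule: P(genuine | indicator).
posterior = hit_rate * base_rate / p_indicator
# Neglecting the base rate suggests ~90% confidence; the actual
# posterior is only about 8%, because genuine cases are so rare.
```

This is the error Heuer describes: analysts read the 90% hit rate as the answer and neglect the 1% base rate, unless a causal story makes the base rate feel relevant.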

Chapter 13: “Hindsight Bias in Evaluation of Intelligence Reporting”

  •  Analysts normally overestimate the accuracy of their past judgments (page 161)
  • Overseers conducting post-mortem analyses of events typically conclude that events were more readily foreseeable than in fact was the case. (page 161)
  • Overconfidence bias cannot be overcome by a simple admission to be more objective. Like optical illusions, cognitive biases remain compelling, even after we become aware of them. (page 162)
  •  People can remember previous judgments, but struggle to reconstruct their previous thinking. (page 162)
  •  When events occur, people tend to overestimate the extent to which they had previously expected them to occur; conversely, when events do not occur, people underestimate the extent to which they had expected them (page 163)
  • Events generally seem less surprising than they should on the basis of past estimates. (page 164)
  •  Hindsight bias increases with the time that has passed since the event occurred (page 164)

Chapter 14: “Improving Intelligence Analysis”

  • Use a checklist (page 173)

o Define the problem
o Generate hypotheses
o Collect information
o Evaluate hypotheses
o Select most likely hypotheses
o Ongoing monitoring

  • Management of analysis should include support for staff in understanding the cognitive process in intelligence analysis, as well as more focus on thinking skills and training. (page 178)
