RECOBIA - Reduction of Cognitive Biases in Intelligence Analysis



This page collects links to external resources on cognitive bias and intelligence analysis.


Farnam Street Blog

Shane Parrish

15 September 2014

Each day we are confronted with hundreds, probably thousands, of decisions, most of which are insignificant, unimportant, or both. The human brain has evolved to hide from us things we are not paying attention to. In other words, we often have a cognitive blind spot: we don't know what we're missing because our brain can completely ignore things that are not its priority at the moment, even if they are right in front of our eyes. Cognitive psychologists have given this blind spot various names, including inattentional blindness.



Harvard Business Review Blog

by Srini Pillay

July 17, 2014

In hindsight, many risks seem obvious.  And when we do take the time to evaluate potential risks, there is often not much that is profound about them.  Yet so many of us fall prey to unforeseen risks, believing that they came out of nowhere or that they could not have been anticipated.  While this may be true in some cases, most of the time risk blindness occurs due to the way our brains are wired. Here are three reasons why we’re blind to risk, and what we can do about it.


by Kate Crawford
Harvard Business Review
13 June 2014

This looks to be the year that we reach peak big data hype. From wildly popular big data conferences to columns in major newspapers, the business and science worlds are focused on how large datasets can give insight into previously intractable challenges. The hype becomes problematic when it leads to what I call "data fundamentalism," the notion that correlation always indicates causation, and that massive data sets and predictive analytics always reflect objective truth. Sadly (...) data and data sets are not objective; they are creations of human design. We give numbers their voice, draw inferences from them, and define their meaning through our interpretations. Hidden biases in both the collection and analysis stages present considerable risks, and are as important to the big-data equation as the numbers themselves.

By Benjamin Brown
Circle City Conference 2014

When gathering open source data and transforming it into actionable intelligence, it is critical to recognize that humans are not objective observers. Conscious and unconscious assumptions drive analysts' choices about which data to analyze and how much importance to ascribe to each resource. Furthermore, analysts' personal conceptual frameworks about reality and how the world works can undermine the process of objectively translating data into intelligence. These implicit assumptions, otherwise known as cognitive biases, can lead to missed data, skewed intelligence, illogical conclusions, and poor decision making. This presentation illustrates cognitive biases relevant to OSINT and what can be done about them.

By Roger Davies
June 4, 2014
ISML Insights

Here's a (revised) link to a Malcolm Gladwell talk about the analyst Konrad Kellen. Kellen was an intelligence analyst with a contrary view of the Vietnam War and the thinking of the North Vietnamese, and the story is interesting. Once again, the importance of recognising and ignoring cognitive biases comes through strongly.

By Adam Moscoe
May 30, 2014

At the core of the Iran intelligence failure is a series of four unchecked assumptions and prefabricated beliefs, combined with an absence of methodological rigour. Failure to 'connect the dots' left the US unable to advance its interests. However, when examining analytical failures, one must be attentive to hindsight bias and to the problem of noise: the challenge of discerning warning signals amid chaos. Since they must defend their views, analysts are more likely to be thorough, systematic, and conscious of biases or subjective beliefs. Sharpened analysis could enable agents to mark indicators for confirming or denying predictions. For example, they could have set a threshold level of violence after which they would have committed to revising estimates of the Shah's strength. However, Robert Jervis, in a recently declassified post-mortem, is quick to note that this style of competitive analysis must be valued and rewarded in ICs in order to bring about meaningful change.

By Morgan Housel
May 30, 2014
The Motley Fool

People disagree with each other because they think the other side is biased into making bad decisions. They rarely assume that they, themselves, might be just as biased. Psychologists have a name for this: blind-spot bias. It's a bias that prevents us from realizing how biased we are. And it is pervasive in investing.

10 March 2014
by Lissy Torres
Michigan State University

While cognitive bias can affect everyone, it is especially dicey among intelligence agents. Dr. Yu-Hao Lee, a Michigan State University Media and Information Studies Ph.D. alum, is currently working with a team at the University of Oklahoma on a game to reduce cognitive bias among intelligence agents. For Yu-Hao, this is a match made in heaven, landing squarely in his field of research: the motivation and psychology of serious games. The two-part game, MACBETH I and MACBETH II, is being developed for the Intelligence Advanced Research Projects Activity (IARPA), the research arm of the US intelligence community.


by Oliver Burkeman
28 February 2014
The Guardian

Nobody's political opinions are just the pure, objective, unvarnished truth. Except yours, obviously.

by Joe Harris
February 25, 2014
Small Wars Journal

The relationship between the intelligence community (IC) and policymakers has been a complex one. Good intelligence is arguably the backbone of good and proper policymaking. The bureaucratic politics and organizational dynamics of the policymaker play a major role in how decisions are made and thus in how intelligence is used to support those decisions. However, a thorough investigation of this relationship may reveal that the true failure is not one of poor intelligence but one of policymaker bias, bureaucratic politics, organizational structures, and information processing errors. For the purpose of this research paper, the author will zero in on one specific event, the 2012 Benghazi attacks, and seek to answer the following research question: What were the biases present in the policymaking community before and immediately after the 2012 Benghazi terrorist attacks, and how did they affect the intelligence process?
