RECOBIA - Reduction of Cognitive Biases in Intelligence Analysis

Library

This page contains links to external resources.

26/08/2013

Source: DefenseMediaNetwork
Date: 19/08/2013

Accounting for their biases is one of the things America’s intelligence analysts struggle with. It’s a challenge that Wirtz, of the Naval Postgraduate School in Monterey, Calif., recently discussed in the journal Intelligence and National Security. Writing about postmortem investigations, which examine the performance of intelligence gathering following intelligence failures and successes, Wirtz references the work of Columbia University professor Robert Jervis, who wrote the book Perception and Misperception in International Politics in the 1970s.

Read more: http://www.defensemedianetwork.com/stories/prevailing-beliefs-why-intelligence-analysis-sometimes-fails/

20/08/2013

By Aries B. Rebugio
Source: Small Wars Journal
Date: 12/08/2013

Since the inception of the intelligence discipline, there have always been problems associated with the collection and analysis of information. Several of these problems occur because the human mind is easily influenced by internal and external factors. Biases and perceptions can distort our view of reality and the way we process information. The purpose of this case study is to allow the reader to better understand the factors that ultimately led to errors in analysis by the US Intelligence Community (IC). In doing so, the reader can become more aware of errors that can be prevented in the future, ultimately leading to better intelligence analysis.

Read more: http://smallwarsjournal.com/jrnl/art/bias-and-perception-how-it-affects-our-judgment-in-decision-making-and-analysis?goback=%2Egde_4690733_member_258265897

01/08/2013

Source: Sources and Methods
Date: 17/07/2013

A recent paper entitled The Secrecy Heuristic presents research substantiating a new heuristic that likely affects both intelligence professionals and decision makers: the idea that we “infer quality from secrecy” when it comes to intelligence analysis. In other words, we give the same information more value just because it is secret. The paper presents three reasons we fall victim to the Secrecy Heuristic, and outlines the experimental evidence that validates the presence of this heuristic in how people evaluate information quality.

Read more: http://sourcesandmethods.blogspot.be/2013/07/do-intel-analysts-believe-info-is-good.html

30/06/2013

Source: “We’re Only Human” Blog

Date: 18 July 2013

We hire and train intelligence agents to weigh risks and make judgments, and most of us want to believe that these assessments are sound. But how rational are the individual men and women who are making the life-and-death decisions that influence national security? A new study raises some serious questions about our usual view of rationality, and how it applies to intelligence agents’ judgments about risk. Cornell University psychological scientist Valerie Reyna shows that experienced intelligence agents think irrationally about risk and loss, even when human lives are at stake.

Read more: http://www.psychologicalscience.org/index.php/news/were-only-human/spooky-judgments-how-spies-think-about-danger.html

25/05/2013

Source: The Scientific American

Date: 15 May 2013

There are two components to each bias. The first is the phenomenon itself: confirmation bias, for example, is your tendency to seek out confirming information while ignoring everything else. The second is the belief that everyone else is susceptible to thinking errors, but not you. This itself is a bias, the bias blind spot: a “meta bias” inherent in all biases that blinds you to your own errors.

Read more: http://blogs.scientificamerican.com/mind-guest-blog/2013/05/15/the-bias-within-the-bias/

25/05/2013

Source: PsyBlog

Date: 23 May 2013

Understanding a psychological bias that illuminates how we negotiate, predict our emotions, agree on a price, and much more, and how to avoid it.

Read more: http://www.spring.org.uk/2013/05/the-anchoring-effect-how-the-mind-is-biased-by-first-impressions.php

24/05/2013

Source: HC Intelligence

Date: April 2013

The Cuban Missile Crisis was arguably the scariest event in mankind’s history, given the sheer destructive potential of the involved nations’ nuclear capabilities. The intelligence community was perhaps expected to be the last line of defense before such a crisis could happen. How did US intelligence analysts fail to make an accurate assessment of such a grand military operation? What were some of the cognitive failures in perceiving and analyzing the adversary’s activities?

Read more: http://hcintelblog.blogspot.be/2013/06/perception-bias-and-cuban-missile-crisis.htm

22/05/2013

Source: Psychology Today

Date: 20 May 2013

The conventional wisdom in classical economics is that we humans are “rational actors” who, by our nature, make decisions and behave in ways that maximize advantage and utility and minimize risk and cost. This theory has driven economic policy for generations despite daily anecdotal evidence that we are anything but rational, for example, in how we invest and what we buy. But any notion that we are, in fact, rational actors was blown out of the water by Dr. Daniel Kahneman, winner of the 2002 Nobel Prize in economics, and his late colleague Amos Tversky. Their groundbreaking, if rather intuitive, findings on cognitive biases have demonstrated quite unequivocally that humans make decisions and act in ways that are anything but rational.

Read more: http://www.psychologytoday.com/blog/the-power-prime/201305/cognitive-biases-are-bad-business

10/05/2013

Source: CNN

Date: 2 May 2013

Security technologist and author Bruce Schneier says the media and the general public shouldn’t heap criticism on the FBI and CIA for not keeping better track of Tamerlan Tsarnaev before the Boston bombings. In doing so, they have displayed a classic cognitive bias: hindsight bias.

Read more: http://edition.cnn.com/2013/05/02/opinion/schneier-boston-bombing/index.html

05/05/2013

Source: HBR Blog Network

Date: 29 April 2013

It is a profound irony that the more you know about a particular industry, and the more experience you gain in it, the more difficult it can be to move it forward with truly meaningful innovation. But it’s true, thanks to something known as “the curse of knowledge”, one of the most vexing cognitive biases identified by psychologists and behavioral economists.

Read more: http://blogs.hbr.org/cs/2013/04/the_innovator_who_knew_too_muc.html