Bryan Caplan  

Kahneman and Renshon on Hawkish Biases

Kahneman and Renshon's chapter on "Hawkish Biases" (Thrall and Cramer, eds., American Foreign Policy and the Politics of Fear: Threat Inflation Since 9/11, 2009) is a big improvement over their Foreign Affairs piece on "Why Hawks Win."  Highlights include...

The role of overconfidence:
A group of researchers has recently documented the link between overconfidence and war in a simulated conflict situation (Johnson, McDermott et al. 2006). Johnson et al. conducted an experiment in which participants (drawn from the Cambridge, MA area, but not exclusively composed of students) played an experimental wargame. Subjects gave ranked assessments of themselves relative to the other players prior to the game, and in each of the six rounds of the game chose between negotiation, surrender, fight, threaten or do nothing; they also allocated the fictional wealth of their "country" to either military, infrastructure or cash reserves. Players were paid to participate and told to expect bonuses if they "won the game" (there was no dominant strategy and players could "win" using a variety of strategies). Players were generally overly optimistic, and those who made unprovoked attacks were especially likely to be overconfident (Johnson, McDermott et al. 2006: 2516).

The consequences of positive illusions in conflict and international politics are overwhelmingly harmful. Except for relatively rare instances of armed conflicts in which one side knows that it will lose but fights anyway for the sake of honor or ideology, wars generally occur when each side believes it is likely to win -- or at least when rivals' estimates of their respective chances of winning a war sum to more than 100 percent (Johnson 2004: 4). Fewer wars would occur if leaders and their advisors held realistic assessments of their probability of success; that is, if they were less optimistically overconfident.
The opacity of intentions:
[P]eople tend to overestimate the extent to which their own feelings, thoughts or motivations "leak out" and are apparent to observers (Gilovich and Savitsky 1999: 167). In recent demonstrations of this bias, participants in a "truth-telling game" overestimated the extent to which their lies were readily apparent to others, witnesses to a staged emergency believed their concern was obvious even when it was not, and negotiators overestimated the degree to which the other side understood their preferences (even in the condition in which there were incentives to maintain secrecy) (Gilovich, Savitsky et al. 1998; Van Boven, Gilovich et al. 2003: 117). The common theme is that people generally exaggerate the degree to which their internal states are apparent to observers.

The transparency bias has pernicious implications for international politics. When the actor's intentions are hostile, the bias favors redoubled efforts at deception. When the actor's intentions are not hostile, the bias increases the risk of dangerous misunderstandings. Because they believe their benign intentions are readily apparent to others, actors underestimate the need to reassure the other side. Their opponents -- even if their own intentions are equally benign -- are correspondingly more likely to perceive more hostility than exists and to react in kind, in a cycle of escalation. The transparency bias thus favors hawkish outcomes through the mediating variable of misperception.
The halo effect at work:
[I]ndividuals assign different values to proposals, ideas and plans of action based on their authorship. This bias, known as "reactive devaluation," is likely to be a significant stumbling block in negotiations between adversaries. In one recent experiment, Israeli Jews evaluated an actual Israeli-authored peace plan less favorably when it was attributed to the Palestinians than when it was attributed to their own government, and pro-Israeli Americans saw a hypothetical peace proposal as biased in favor of Palestinians when authorship was attributed to Palestinians, but as "evenhanded" when they were told it was authored by Israelis. In fact, the phenomenon is visible even within groups. In the same experiment, Jewish "hawks" perceived a proposal attributed to the dovish Rabin government as bad for Israel, and good for the Palestinians, while Israeli Arabs believed the opposite (Maoz, Ward et al. 2002).
My main disappointments:

1. While Kahneman and Renshon admit the possibility that hindsight bias clouds their judgment, they don't take strong steps to defuse it.  They could have gone through their paper bias-by-bias, asking, "Could this also lead to dovish bias?  Does it?"  I'm probably much more dovish than either Kahneman or Renshon, but behavioral economists really need to search harder for tensions between their psychology and their decidedly left-leaning politics.

2. Kahneman and Renshon don't even cite Tetlock's work showing that political experts are bad forecasters, especially on issues that are controversial among experts.  You could say that this isn't really a "hawkish bias."  But combined with mildly deontological moral views on killing innocents, it is.

3. Kahneman and Renshon don't even mention Social Desirability Bias.  This seems like a big omission, because "Let's stand up for our country" sure sounds better than "Let's swallow our pride to avoid costly conflict," "Maybe we'll lose" or "Perhaps the other side has reasonable grievances against us."
