Bryan Caplan  

Overcoming Bias: Some Empirics


Wilson and Brekke's justly famous article also contains an eye-opening survey of the empirics of "mental correction," better known at GMU as overcoming bias.  While I'm sure the sub-field has advanced since 1994, it's amazing how much was already known at the time:

A number of studies have attempted to reduce biases in information processing and judgment by forewarning people about or drawing their attention to potentially biasing information and examining the extent to which they are able to avoid the unwanted effects of this information... In general, these studies have revealed a wide range of seemingly contradictory effects. Some studies have shown that an increase of people's awareness eliminates mental contamination; some have found that awareness causes people to adjust insufficiently, leading to undercorrection; some have indicated that awareness causes people to adjust their responses too much, leading to overcorrection; and some have shown that awareness does not cause people to adjust their responses.

One case they discuss in detail is efforts to correct for priming effects.

Consider, for example, the classic priming effect, whereby people's judgments shift in the direction of the primed category (e.g., if the category of "kindness" is accessible, people typically rate another person as more kind than they normally would; Higgins et al., 1977; Srull & Wyer, 1989). Recent studies have shown that making people aware that the category has been primed by an arbitrary event causes them to adjust their responses (Lombardi, Higgins, & Bargh, 1987; Martin, 1986; Martin, Seta, & Crelia, 1990). Interestingly, however, increasing awareness does not make the priming effect disappear; it often reverses, resulting in a contrast effect (Lombardi et al., 1987; Martin, 1986; Martin et al., 1990). For example, if people realize that kind thoughts are accessible for arbitrary reasons, they end up rating the target person as less kind than they normally would.

If Wilson and Brekke were economists, they'd probably be more inclined to treat a mixture of undercorrection and overcorrection as evidence in favor of human rationality.  Either way, it's fascinating to discover that this research not only exists, but has been sitting on the shelf for decades.  Why didn't I hear about this in grad school?


Comments and Sharing

COMMENTS (3 to date)
jc writes:

An excerpt from the comments section of Arnold's "recognized but marginalized" post...(Bryan, you seem to have a lot of 'books to write' already in the queue. For me, this would be an interesting one you might consider adding.)

Unique histories, cultures, and sets of vested interests need to be taken into account, as well as widespread cognitive biases that seem to undermine legitimate attempts at positive institutional change. Regarding cognitive bias, evolution didn't seem to equip us, for example, w/ an appreciation for markets, while it did render us quite prone to populism. Too bad...

(Btw, I'd love to see Bryan, Robin and a friendly behaviorist take a stab at coauthoring a book that explains *why* anti-market bias exists and how it might be overcome, why all it takes is one simple intermediate step for people to condone and even call for coercive behavior they'd decry if they had to do the coercing, etc., i.e., a book about how cognitive biases cause us to reject institutions the literature suggests make us all better off. North, of course, sometimes puts down libertarians. Heck, he is a former Marxist - though his favorite economist these days is Hayek. When I read that literature, though, the stuff that seems to work sure sounds to me like what a libertarian would vote for, were those institutions offered.)

fundamentalist writes:

Research in public relations shows that people differ in how they accept new knowledge. Some depend upon an authority, while others think for themselves. Those who depend on authority must get the new information from an authority they respect or they won't process it. Those who think for themselves must be able to see how the new information fits with what they already know.

BTW, depending upon an authority to filter one's information is perfectly rational. It's the old division-of-labor thing. Most people don't have the time or the inclination to become experts in everything, and it would be foolish to try. So they become experts at some things and rely on experts for other things.

Christian Galgano writes:

I recommend the whole video, but check out 11:45 on for what's almost the avant-garde of confirmation bias and moral psychology: http://www.edge.org/3rd_culture/morality10/morality.haidt.html#haidt-video

Professor Haidt will be sending you the final paper I wrote that synthesizes the state of confirmation bias and moral psychology with TMORTV in less than two weeks.

I read your exchange with Professor Haidt from the spring, and he has updated his theory of morality since then (partly with the aid of his bet...go Haidt/Caplan).

The video may also answer why you didn't hear about Wilson (UVA) and Brekke at Ptown.

--Christian, psych/econ major at UVA

Comments for this entry have been closed