The following discussion from Wilson and Brekke’s “Mental Contamination and Mental Correction” was a revelation for me.  I abhor unedited blockquoting, but this passage is so compactly informative it’s hard to cut a word:

As noted by Gilbert (1991, 1993), there is a long tradition in philosophy and psychology, originating with Descartes, that assumes that belief formation is a two-step process: First people comprehend a proposition (e.g., “Jason is dishonest”) and then freely decide whether to accept it as true (e.g., whether it fits with other information they know about Jason). Thus, there is no danger in encountering potentially false information, because people can always weed out the truth from the fiction, discarding those propositions that do not hold up under scrutiny. Gilbert (1991, 1993) argued persuasively, however, that human belief formation operates much more like a system advocated by Spinoza. According to this view, people initially accept as true every proposition they comprehend and then decide whether to “unbelieve” it or not. Thus, in the example just provided, people assume that Jason is dishonest as soon as they hear this proposition, reversing this opinion if it is inconsistent with the facts.

Under many circumstances, the Cartesian and Spinozan systems end up at the same state of belief (e.g., that Jason is honest because, on reflection, people know that there is no evidence that he is dishonest). Because the second, verification stage requires mental effort, however, there are conditions under which the two systems result in very different states of belief.  If people are tired or otherwise occupied, they may never move beyond the first stage of the process. In the Cartesian system, the person would remain in a state of suspended belief (e.g., “Is Jason dishonest? I will reserve judgment until I have time to analyze the facts”). In the Spinozan system, the person remains in the initial stage of acceptance, believing the initial proposition. Gilbert has provided evidence, in several intriguing experiments, for the Spinozan view: When people’s cognitive capacity is taxed, they have difficulty rejecting false propositions (see Gilbert, 1991, 1993).

If only I’d known about this line of research when writing The Myth of the Rational Voter, it would have been a lot easier to answer the complaint that rational irrationality is “psychologically implausible.”  As long as unbelieving propositions requires affirmative effort, the claim that people are more irrational when the cost of error is low readily follows.  The greater the effort required, the higher the stakes you need to motivate people to get the job done.  (And if the stakes remain low, the job probably won’t get done!) Perhaps even Jeff Friedman would have been placated.

Admittedly, I’d still be tempted to make this story more motivational and less cognitive.  We may have a cognitive bias towards assent, but it’s often our emotions that dissuade us from making the effort to unbelieve.  That’s why providing free counter-arguments so rarely changes the minds of true believers.

Wilson and Brekke then add an interesting evolutionary twist:

Gilbert (1991) argued that the Spinozan procedure of initial acceptance is an adaptive one. It has its roots in the perceptual system, he suggested, wherein it is to people’s great advantage to believe that what they perceive reflects reality (e.g., a car speeding in their direction is not a hallucination). In the realm of belief, there is a greater likelihood that new propositions are false (e.g., hearing that “New Blippo Detergent gets out even the toughest stains!”). But, even in the realm of belief, it can be highly efficient to initially believe what one hears. In Gilbert’s (1991) words, “just as perceptual systems enable timely action by capitalizing on the fact that most percepts are faithful, cognitive systems may achieve similar efficiency by capitalizing on the fact that most propositions are true” (p. 116). The initial acceptance of propositions, then, fits nicely into our first class of mental contamination: This state of affairs is highly adaptive much of the time but can lead one astray under certain conditions (e.g., when people are tired, preoccupied, or cognitively taxed, such that they do not have the resources to complete the second, “unacceptance” stage of the Spinozan process).

Beautiful.  I may even have to stop badmouthing Spinoza.