Bryan Caplan  

Bias, Assent, and the Psychological Plausibility of Rational Irrationality


[Update: Link fixed.]

The following discussion from Wilson and Brekke's "Mental Contamination and Mental Correction" was a revelation for me.  I abhor unedited blockquoting, but this passage is so compactly informative it's hard to cut a word:

As noted by Gilbert (1991, 1993), there is a long tradition in philosophy and psychology, originating with Descartes, that assumes that belief formation is a two-step process: First people comprehend a proposition (e.g., "Jason is dishonest") and then freely decide whether to accept it as true (e.g., whether it fits with other information they know about Jason). Thus, there is no danger in encountering potentially false information because people can always weed out the truth from the fiction, discarding those propositions that do not hold up under scrutiny. Gilbert (1991, 1993) argued persuasively, however, that human belief formation operates much more like a system advocated by Spinoza. According to this view, people initially accept as true every proposition they comprehend and then decide whether to "unbelieve" it or not. Thus, in the example just provided, people assume that Jason is dishonest as soon as they hear this proposition, reversing this opinion if it is inconsistent with the facts.

Under many circumstances, the Cartesian and Spinozan systems end up at the same state of belief (e.g., that Jason is honest because, on reflection, people know that there is no evidence that he is dishonest). Because the second, verification stage requires mental effort, however, there are conditions under which the two systems result in very different states of belief.  If people are tired or otherwise occupied, they may never move beyond the first stage of the process. In the Cartesian system, the person would remain in a state of suspended belief (e.g., "Is Jason dishonest? I will reserve judgment until I have time to analyze the facts"). In the Spinozan system, the person remains in the initial stage of acceptance, believing the initial proposition. Gilbert has provided evidence, in several intriguing experiments, for the Spinozan view: When people's cognitive capacity is taxed, they have difficulty rejecting false propositions (see Gilbert, 1991, 1993).

If only I'd known about this line of research when writing The Myth of the Rational Voter, it would have been a lot easier to answer the complaint that rational irrationality is "psychologically implausible."  As long as unbelieving propositions requires affirmative effort, the claim that people are more irrational when the cost of error is low readily follows.  The greater the effort required, the higher the stakes you need to motivate people to get the job done.  (And if the stakes remain low, the job probably won't get done!) Perhaps even Jeff Friedman would have been placated.
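The logic here can be sketched as a toy model (my own illustration, not anything from Gilbert or from Wilson and Brekke): acceptance is automatic and free, while "unbelieving" costs effort, so verification happens only when the stakes of being wrong exceed that cost. All the names and numbers below are assumptions for the sake of the sketch.

```python
# Toy model of Spinozan belief formation (illustrative only; the
# function name and the numeric stakes/costs are made up, not drawn
# from the psychological literature).

def final_belief(proposition_is_true, stakes, verification_cost):
    """Return the agent's belief after (possibly) verifying.

    Step 1 (automatic): the proposition is accepted as true.
    Step 2 (effortful): it is re-checked only if the stakes of being
    wrong exceed the cost of the mental effort.
    """
    belief = True  # Spinozan default: to comprehend is to accept
    if stakes > verification_cost:
        belief = proposition_is_true  # effortful "unbelieving" weeds out falsehoods
    return belief

# A false proposition sticks when the cost of error is low...
assert final_belief(proposition_is_true=False, stakes=1, verification_cost=5) is True
# ...but gets corrected when the stakes are high enough to motivate the effort.
assert final_belief(proposition_is_true=False, stakes=10, verification_cost=5) is False
```

On this sketch, "rational irrationality" falls out immediately: when stakes stay below the cost of verification, the first stage is the last stage, and error persists.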

Admittedly, I'd still be tempted to make this story more motivational and less cognitive.  We may have a cognitive bias towards assent, but it's often our emotions that dissuade us from taking the effort to unbelieve.  That's why providing free counter-arguments so rarely changes the minds of true believers. 

Wilson and Brekke then add an interesting evolutionary twist:

Gilbert (1991) argued that the Spinozan procedure of initial acceptance is an adaptive one. It has its roots in the perceptual system, he suggested, wherein it is to people's great advantage to believe that what they perceive reflects reality (e.g., a car speeding in their direction is not a hallucination). In the realm of belief, there is a greater likelihood that new propositions are false (e.g., hearing that "New Blippo Detergent gets out even the toughest stains!"). But, even in the realm of belief, it can be highly efficient to initially believe what one hears. In Gilbert's (1991) words, "just as perceptual systems enable timely action by capitalizing on the fact that most percepts are faithful, cognitive systems may achieve similar efficiency by capitalizing on the fact that most propositions are true" (p. 116). The initial acceptance of propositions, then, fits nicely into our first class of mental contamination: This state of affairs is highly adaptive much of the time but can lead one astray under certain conditions (e.g., when people are tired, preoccupied, or cognitively taxed, such that they do not have the resources to complete the second, "unacceptance" stage of the Spinozan process).

Beautiful.  I may even have to stop badmouthing Spinoza.


COMMENTS (12 to date)
Less Antman writes:

The link doesn't work, so I'll reserve judgment on the accuracy of your description ... no, wait ... I'll assume it is true since it is too inconvenient to verify it.

Hyena writes:

This paper is backed up by almost two decades of Internet debate in which people assume the truth of anything you say. If you watch Interweb debate closely, you will find that people tend to quarantine rather than reject inconvenient assertions.

I submit that people have an aggressive hyperbolic discount curve here. People often don't even Google it or check Wikipedia. Of course, that may mean it's a calculation: why lie when you'll be so easily discovered?

I'd love to see if the paper deals with that question. Does belief acceptance change when the ease of verification changes?

jb writes:

Less Antman +1!

Once I realized that the Internet is full of lies (back in 1989ish) I started practicing the assumption that everything I hear is a lie, and I must spend mental effort before I believe it to be true. That practice has spread to the real world for me, at least in some cases.

However, I do that far less in practice than I should.


So back to a previous post - maybe people are easily manipulable when they're tired?

Jason Brennan writes:

I find it bizarre that so many people claim rational irrationality is psychologically implausible. There is rather extensive research into the phenomenon of "motivated reasoning", and what you call rational irrationality is an instance of motivated reasoning. Most of the people whom I have encountered who reject the rational irrationality thesis do so either 1) because it conflicts with certain models of human rationality they are committed to a priori, or 2) because they do not know much about psychological research on bias. In neither case is there a reason to take the person's opinion seriously. In case 2, the person is ignorant, and in case 1, the person is irrational.

Brian Clendinen writes:

I think this has to do with the same thought process that explains why stereotypes work. Typically, when one hears something, it is normally correct, or at least most of what is said is true. So a person's accepting the truth at face value makes a lot of sense. I would think in a world where a majority of what is said were lies, the opposite would be true. I know there are certain people or organizations who, when they discuss certain subjects (because of past experiences I had with them), I reject everything they say as false until I can prove it otherwise. However, if I had no experience with the entity, Spinoza's theory would hold true.

Really, I think this shows why schools and the media are so important in shaping voters' mindsets and which issues they do or don't find important. It goes to show the media's liberal bias is actually very rational despite the profit it destroys in the long run. However, I am glad it is a lot less than it used to be.

Jim writes:

I found a pdf of the paper here:

http://people.virginia.edu/~tdw/wilson&brekke.1994.pdf

fundamentalist writes:

Research in public relations shows that people decide what is true emotionally based on what they want to be true and then search for confirming evidence. So if people like Jason, they'll tend to believe he is honest and if they don't they'll believe he is dishonest no matter the evidence.

Very few people care about truth.

In addition, there are three different levels of belief. If I drew three concentric circles with the most important beliefs at the center and the least important on the outer circle, the outer circle would hold opinions about things that are easily changed. Opinions don't require a change in attitude or belief system. An opinion might be which brand of toothpaste to use.

On the next inner circle would be attitudes. An attitude might be the decision to use toothpaste or not, or views on hygiene. Attitudes are hard to change and require a significant emotional event in one's life, or about 3 years of counseling, to change.

At the center are world views, which are almost impossible to change.

As for rational irrationality, that's a clever device to get people's attention, but it's impossible and doesn't make any sense anyway. A rational person won't knowingly make himself worse off. That doesn't mean a rational person can't be wrong; it just means he may not know everything he needs to know. Caplan's definition of irrationality is more like ignorance.

As I understand it, the philosopher Colin McGinn argues that people do not choose what they believe. It seems that some people are quite resistant to logical arguments, while others find logic irresistible. Perhaps neither Spinoza nor Descartes's model of belief is sufficient as a general explanation of what and how people come to believe.

By the way, speaking of rational irrationality seems to be pretty much in the same neck of the woods as speaking of red non-red and ugly not ugly; just a bit nonsensical, perhaps? But wait; reaching that conclusion would require logic --- or is it non-logic?

Jason Brennan writes:

I don't understand why people think rational irrationality is incoherent. It's a very simple idea.

A person is rationally irrational to the extent that she exhibits epistemic irrationality because it is instrumentally rational for her to do so.

Instrumental rationality concerns performing actions that serve one's ends.

Epistemic rationality concerns processing information and arriving at beliefs via reliable thought processes.

We know people sometimes are epistemically irrational. Now, if we can identify instances where people are epistemically irrational and it is their self-interest to be so, then we have identified instances of rational irrationality.

None of this requires voluntarism about belief--you don't need to hold that people literally choose to believe or disbelieve. Instead, you just need to hold that people can choose to put more or less effort into thinking.

Again, just read the psychological literature on motivated reasoning, and you will learn actual psychological mechanisms by which rational irrationality takes place. I'm surprised there is even any debate at this point about whether such a thing exists.

fundamentalist writes:

Jason, that would depend upon your definition of rationality. I like economist Roger Miller's definition that I posted above. Using that definition, a person can't knowingly make themselves better off by knowingly making themselves worse off. Now if you want to use irrational as a synonym for ignorance, then yes, rational irrationality works, but what's the point? Why not say people are ignorant? But if irrationality and ignorance are the same thing, then what word do we use for people who do things that don't make any sense or violate the rules of logic?

Philo writes:

". . . there is a long tradition in philosophy and psychology, originating with Descartes, that assumes that belief formation is a two-step process: First people comprehend a proposition . . . and then freely decide whether to accept it as true." Descartes certainly assumes (correctly) that *understanding* a proposition is part of, but not the whole of, *believing* it: that it is quite possible to understand without believing (though one cannot believe without understanding). The relationship is logical--believing *entails* understanding, understanding *does not entail* believing. According to Gilbert, Descartes thinks the relationship is also temporal--that understanding occurs *before* believing. This interpretation is suggested by Descartes' taking belief-formation to be an *act of will*: the act must, presumably, have the already-understood proposition as its object. Normally, when one acts on an object, the object already exists. But if there may be exceptions, where the object comes into existence simultaneously with the action upon it, then Descartes is not committed to temporal succession.

"According to [Gilbert's Spinozistic] view, people initially accept as true every proposition they comprehend and then decide whether to 'unbelieve' it or not." This is an overstatement. What people initially do is not *believe* the proposition, but *feel some inclination* to believe it, an inclination which in some cases is overridden by a contrary inclination. Understanding might be described as "as-if believing"--as *simulating* belief in certain respects. This implies that the understander has a certain tendency to believe, but not that he actually believes.

By the way, the very concept of belief implies self-conscious, critical thinking, and so it applies only to human beings (so far as we know). Lower animals seem to operate without beliefs, relying exclusively on what in a human being would count as a mere tendency to believe. Some philosophers have called such states "seemings," in contrast with actual "beliefs."

To classify understanding as in itself a "seeming" is to embrace what one might call a "quasi-Spinozistic" view of understanding. It allows one to reject the false Spinozistic thesis that the understander *at first* *actually believes* the proposition (Wilson and Brekke refer to "the Spinozan procedure of initial acceptance"). Actually, one can disbelieve a proposition from the first moment of entertaining it, if one's initial disposition to disbelieve it is strong enough to overcome the understanding-seeming.

Jason Brennan writes:

Fundie:

I don't understand your question in light of my last comment--it seems that I gave you the common definitions of epistemic and instrumental rationality (which are two different forms of rationality covering two different things), and then explained how to coherently define rational irrationality in terms of instrumental and epistemic rationality.

So, again: X is rationally irrational if and only if being (to some degree) epistemically irrational promotes her ends (and is thus instrumentally rational).

Ignorance has to do with a lack of information and a lack of belief. Epistemic irrationality has to do with failures to process information and to form beliefs properly.
