Bryan Caplan  

The Undermotivated Apostate

People rarely revise their beliefs on issues they care about.  Even when confronted with strong counter-evidence, they usually manage to weasel out somehow.  When you encounter someone who has revised his beliefs, therefore, it's tempting to conclude that he's highly reasonable.  Apostates - people who abandoned a whole belief structure - smugly feed this temptation: "When the facts change, I change my mind.  What do you do, sir?"  As a serial apostate, I've often smugly fed this temptation myself.

When I listen to apostates, however, I'm usually struck by the flimsiness of their deconversion stories.  Why exactly did they change their minds?  A reasonable apostate would go through a process like:

1. I used to believe X, where X is something that at least sounds vaguely plausible.

2. But then I noticed a non-obvious but telling intellectual flaw in X.

3. I approached the best minds who believe X with my doubts, but none of them had a good response.

4. So I stopped believing X.

In practice, many apostasy stories discuss people rather than ideas: I had a falling-out with my fellow believers, so I stopped agreeing with them.  But even the idea-centric stories sound more like:

1. I used to believe X, where X is something that sounds silly.

2. But then I noticed an obvious and telling intellectual flaw in X.

3. I ignored the flaw for a while.

4. Then I finally woke up and stopped believing X.

My point here is not that people shouldn't change their minds.  They totally should.  My point, rather, is that human irrationality is even more prevalent than it seems.  Most people are too irrational to change their minds on anything important.  But most people who change their minds on important issues nevertheless do so irrationally.




COMMENTS (5 to date)
Philo writes:

Is this really your conclusion: "But most people who change their minds on important issues do so irrationally"? I thought your point was that most people who change their minds on important issues don't deserve much credit, because what they used to believe was so absurd. In such a case it is rational to change your mind, but it was grossly irrational to be in that state of mind to begin with.

We're biological animals molded by evolutionary forces to behave in ways that enhance our fitness (or more precisely the fitness of our genes). Human language and reason are not magical transcendent forces bestowed by god (unless you believe in that kind of thing), but products of our evolution.

Moral attitudes are guided by fitness enhancement: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4240999/

And reason evolved to serve our fitness interests, not arrive at truth: https://hal.archives-ouvertes.fr/hal-00904097/document

In light of all that, Bryan, your observations about conversions are predictable and inevitable.

BZ writes:

"Most people are too irrational to change their minds on anything important."

How do they view what is/isn't important?

Could it be that most beliefs about things without significant marginal personal consequences are matters of personal identity, and their expressions group identity signaling?

In which case it is perfectly rational to believe all kinds of stupid things precisely because they are personally insignificant. Changing those beliefs for group-identity reasons would likewise be rational.

Shane L writes:

The well-known zeal of the convert suggests an irrational element for some too. A change in opinion may coincide with a change in identity, which hinders clear-minded judgement of ideas from fellow members of one's new ideological tribe.

Perhaps several serious changes of mind could lead one towards a cautious attitude closer to agnosticism, knowing that one is likely to change opinion in the future again.

Thomas Sewell writes:

Yeah, my personal optimization is to reserve judgement on things until I actually have pretty good evidence about them.

You don't actually have to have an opinion on most things, so unless I've actually researched it or must make a decision right away, I prefer not to.

That makes it much easier to just update what I know about an issue as new information comes in. I'm always happy to hear new evidence about something I actually have an opinion on, but in that case I've also already put in significant effort to have the "right" opinion. That sets the bar pretty high for what may cause me to re-evaluate, because most people I'd seriously debate something with haven't bothered to learn much about the issue or considered other perspectives.
