Bryan Caplan  

What's Worth Overcoming?


What's so special about "overcoming bias"? Tyler's questioned Robin's obsession twice (here and here), mingling sensible observations with bizarre Dadaisms like:

If I were allowed to retitle Robin's blog (and I am not), I would call it "Reaping the Fruits of Bias."
I see "overcoming laziness" or "overcoming fear" or even "overcoming inadequate love of Sichuan chili peppers" as often a more important problem than "overcoming bias."
A clear analysis of the value of overcoming bias begins with a simple axiom: If a person values exactly the right things and has exactly true beliefs, he will always do the right thing. A corollary: Wrong values and/or false beliefs are the cause of all wrong actions. These truisms suggest a series of questions.

1. Why talk about "overcoming bias," when the real problem is "overcoming error"?

Simple: A bias is an identifiable tendency to make certain kinds of error. Saying "fight error" is analogous to telling investors "buy low, sell high." Saying "fight bias" is analogous to specifying when prices tend to be low and when they tend to be high.

2. Assuming someone's values are not exactly right, does overcoming bias necessarily tend to improve his decisions?

Nope. If the Nazis had had severely biased beliefs about how to commit mass murder, they would have been less effective and done less wrong.

3. Assuming someone's beliefs are not exactly right, does changing his values in the right direction necessarily tend to improve his decisions?

Again, nope. For example, I've previously argued that given voters' beliefs about economics, crass voter selfishness would lead to better outcomes.

4. Forget "necessarily" - in the world as we now find it, do marginal improvements in values and beliefs normally lead to more righteous actions?

In The Myth of the Rational Voter I argue that given the political motivations of the typical voter in the modern Western world, less biased beliefs are very likely to lead to better policies. (Here's why; see also here). And while I maintain that greater political selfishness would partially mitigate the harm of economic illiteracy, there are many other value changes that, at the current margin, would lead to more righteous actions. Movement away from patriotism and piety would be a good start. Maybe Tyler is right to put laziness and fear on the list too, but I'm less sure about those - should the typical person really consume less leisure and take more risks?


COMMENTS (11 to date)
Perry E. Metzger writes:

I have to confess that I'm less than impressed with Robin's "Overcoming Bias" blog. Generally I've found that, most of the time, the ideas on it have fallen apart when probed, at least for me.

Mostly I feel like the blog is really "replacing your biases with biases I (Robin) prefer."

For example, there was a discussion that touched at one point on the fact that measurements of fundamental constants in physics often have error bars that do not overlap with the error bars of the previous best measurement. I brought this up as an example of a case in which it is difficult, if not impossible, to figure out your systematic errors and thus to reduce your biases. Robin insisted that the physicists were clearly being dishonest with themselves, but I could never pin him down on anything prescriptive. Frankly, after a little while I thought he had very little going for him in the discussion beyond his own conviction that the people involved in such measurements must be being intellectually dishonest "somehow," without actually being able to say how they could stop doing that.

Anyway, to make a long story short, I stopped reading the blog after a while. It started seeming like a substantial waste of time.

Robin is a very smart guy, and I like him, but I think that a number of his ventures, including idea futures (which, on examination of the record of real political betting etc., don't seem to yield much interesting information), the "Overcoming Bias" blog and related papers, etc., seem to be much more show than content.

TGGP writes:

Values are unfalsifiable. How could right and wrong values be distinguished?

By the way, have you read Walter Block's review of your book? He was very critical.

Tyler Cowen writes:

Bias motivates. A less biased Bryan would be a less motivated Bryan. The same might be said for voters. One key question is whether we are talking about reducing bias as a kind of free lunch, or whether we must also be reducing the causes and consequences of bias as well. It's not so simple.

But I know I wouldn't want you, Bryan, to be any less biased than you are. You are another fruit which we reap from the tree of bias.

TGGP writes:

Tyler, is Bryan at a local maximum of your subjectively judged approval-vs-bias function? Would you perhaps like him to be marginally more biased?

Tobbic writes:

A system of values isn't right or wrong (or better or worse). IMO, a system of values describes which actions are right and which are wrong. A system of values can't be falsified by empirical experiment because there's no universal system of values to be observed in nature. Values are subjective; in other words, they are defined by the person.

When choosing an optimal action, I think the system of values gives a set of constraints, and preferences define the utility function.

You can have some preferences over systems of values, like generality, practicality, and simplicity. However, you can use these only to choose the optimal system of values within a certain equivalence class of systems of values.

I would argue that, given any system of values and any set of preferences, a person would want to have beliefs as truthful as possible. This is because people aren't actually interested in choosing the optimal action but the optimal outcome. Thus, overcoming bias, as a means to attain more truthful beliefs, is recommendable for any person.

And btw, I think Overcoming Bias is a superb blog ;).

TGGP writes:

"I would argue that given any system of values and any set of preferences a person would want to have as truthful beliefs as possible."
What if they valued having incorrect beliefs?

Robin Hanson writes:

Perry, if you see a consistent pattern where your error bars are too narrow, my advice is to widen them.

Tyler, do we see any evidence whatsoever that less biased people are under-motivated?

Jody writes:

"A corollary: Wrong values and/or false beliefs are the cause of all wrong actions."

Whatever happened to the trembling hand?

In other words variance matters too.

Sandeep writes:

Robin, I'd say believing in less bias is alone a tremendous motivator.

Tobbic writes:

"What if they valued having incorrect beliefs?"

Yes, people can postulate any values. Any belief (incorrect or correct) can be postulated as a value in itself.

IMO, what's important are the beliefs according to which people act, not those they say they believe. So, in which instances are people observed to consistently act on an incorrect belief (one proven to be untrue)?

Obviously there are examples of this, but I would argue that most such persons can be labeled as having "mental problems" (e.g., the belief that the sky will fall if I don't wave my hands in intricate patterns).

TGGP writes:

Bryan Caplan just wrote a book about how people prefer to believe things that aren't true.
