Arnold Kling  

Argumentative Theory




having a confirmation bias makes complete sense. When you're trying to convince someone, you don't want to find arguments for the other side, you want to find arguments for your side. And that's what the confirmation bias helps you do.

The point is that our reasoning ability evolved in order to make us persuasive and also difficult to persuade. It did not evolve in order to find truth.

This confirms the bias of many Masonomists. Have Robin Hanson and Tyler Cowen already blogged on it?

The proponents of the theory claim that there is a solution.


when people are able to discuss their ideas with other people who disagree with them, then the confirmation biases of the different participants will balance each other out, and the group will be able to focus on the best solution. Thus, reasoning works much better in groups. When people reason on their own, it's very likely that they are going to go down a wrong path. But when they're actually able to reason together, they are much more likely to reach a correct solution.

Of course, assuming that the goal of group discussion is to reason together may be no better than assuming that the goal of individual reasoning is to arrive at the truth. There certainly are other things going on in a group context, including signals of affiliation, signals of dominance, and so on.







COMMENTS (16 to date)
Shangwen writes:

Don Taylor also blogged on it here.

Mike Kenny writes:

I actually wrote a while back in the comments section of Robin Hanson's blog:

This seems related to how rationality in a person is affected by their relation to others. I sometimes wonder if it's worth thinking about individual bias as part of a bigger picture of how groups function. Maybe it's good to be biased in some contexts, because it serves some good. I think of lawyers who are biased for their clients, but the overall trial process is one whose results we tend to think are probably less biased than if one person reasoned things out alone, because the competing arguments target certain flaws in the opposing view.

Perhaps it would be profitable to think about how our rational processes worked in the stone-age environment, in the typical situations in which we would employ rationality and arguments. Perhaps we might develop a 'stone-age rationality' idea from this, if there hasn't already been such an idea: how to work with our rational natures rather than against them in our novel modern environment. Sometimes trying to overcome certain biases feels a bit like forcing myself to walk on all fours, just unnatural. That's not to say overcoming bias isn't worth doing; I just wonder what ways work best. I think of the metaphor of someone trying a conventional diet versus a paleo diet, and perhaps finding the paleo diet easier because it works with his nature.

Warren Winter then replied:

You would enjoy reading Hugo Mercier and Dan Sperber's work on the social function of reasoning. Some links to publications are on this page: http://hugo.mercier.googlepages.com/theargumentativetheoryofreasoning

Joseph K writes:

I agree that their solution doesn't work so well with normal group dynamics. Dominant persons tend to push the group toward agreeing with them, and everyone else tends to fall in line. Fortunately, this only happens in person. When people are lobbing arguments back and forth in the blogosphere or the pages of a scholarly journal, they're less constrained by these social pressures.

Cahal writes:

Interesting.

Personally, I tend to get on better by myself when I'm doing work, so I'm not so sure about the group thing. Having said that, I'm talking about maths, which is simply not debatable, so confirmation bias doesn't really play a role.

D. F. Linton writes:

This is of course proven by the Einstein Committee's General Theory of Relativity...oh wait, never mind.

John writes:

The point is that our reasoning ability evolved in order to make us persuasive and also difficult to persuade. It did not evolve in order to find truth.

This seems a rather extreme claim to make. I could perhaps understand that our rhetoric evolved to persuade, but our reason was evolving before our language developed that far, and our survival would have depended on reasoning leading to truth, not just to my favored position regardless of its truth status relative to the external world.

I can understand that "truth" might not be a unique proposition for every argument, so the rhetorical aspects will matter, but they cannot be devoid of some basis in truth -- except in purely academic cases where truth doesn't matter and it's only about the underlying logic.

Yancey Ward writes:

The entire idea seems to have been developed by some egghead working alone.

Lee Kelly writes:

Karl Popper once argued that a little dogmatism could be good for science, so long as the methods, norms, and institutions of science did not share that dogmatism. Basically, hypotheses often need passionate defenders if their refutation is to be thorough. In fact, all this argumentative theory stuff was more or less in Popper. His whole anti-justificationist slant to philosophy is a response to this type of thinking.

Tom West writes:

I doubt the evidence has ever been (or can be) collected, but at a guess, I'd say that societies that consistently believe a single lie will usually annihilate societies that embrace a wide diversity of opinions, many of which are closer to the truth.

My experience with life in general (with occasional, but important, exceptions) is that the advantage to working groups of knowing the 'truth' is usually smaller than the advantage of everybody working together. Group affiliation ("I'll stand with you because you're one of us") tends to make people more effective and happier than simple rationality ("I'll stand behind you until such time as I believe you are no longer holding what I view as the truth").

I suspect "truth-seeking" is what many of us do because we have to, not because the outcomes are better for ourselves or those around us.

Certainly if you're pushing a particular policy, it's often the case that the "truth-seekers" who support your position are far more dangerous to your success than those who outright oppose your policy.

Philo writes:

Haidt says, "Reasoning was not designed to pursue the truth"; but that can't be more than a half-truth. Having true beliefs is too valuable not to form part of the evolutionary "design." On the other hand, winning arguments, while often valuable, may just as well be disvaluable--when the course of action for which you are arguing only seems attractive to you *because of your false beliefs*.

Note that a certain amount of "confirmation bias" must be inherent to our psyches. As skeptics have pointed out for millennia, we have no *reason* to accept the basic reliability of the belief-forming processes with which we are naturally equipped. We just have to accept, on what Santayana called "animal faith," that our beliefs are broadly correct.

Philo writes:

I am reminded of one of Burke's most-repeated sentences: "We are afraid to put men to live and trade each on his own private stock of reason; because we suspect that this stock in each man is small, and that the individuals would do better to avail themselves of the general bank and capital of nations and of ages." The Sperber amendment would be to substitute 'misdirected' for 'small': the individual's reason works poorly at achieving truth not because it is weak but because it is aimed elsewhere. But society as a whole amalgamates these misdirected individual efforts in a way more likely to achieve truth.

The point also lends some support to the Efficient Market Hypothesis, that the "opinion" of the market as a whole is superior to that of any individual component.

Tom West writes:

Having true beliefs is too valuable not to form part of the evolutionary "design."

Any evidence of that? Sure, for completely obvious in-your-face truths, perhaps. But most truths either:

(1) don't mandate any radical change in behaviour (e.g. moon-landings are a hoax)

(2) mandate behaviour that when applied as a whole make for *less* chance of survival (e.g. God will punish you for bad things even if you could 'get away with it')

or

(3) are gray enough that being somewhat close is 'good enough'. (i.e. modifying P a bit about some cause and effect relationship)

Contrast this with the *strong* advantage of group solidarity.

Now, of course, counter-examples where truth is actually useful are rife, but I think even a cursory look at the evidence will indicate that in most cases, truth is only a marginal factor in how well groups prosper.

John writes:

Tom W. makes a very strong point about the benefits of groups working together. I certainly agree that over a large range of scenarios a group that works together well with a shared goal will do better than a poorly organized group with a grudgingly shared goal.

I don't think that quite closes the discussion, though. Consider two hypothetical evolutionary paths. Group A follows a path of reasoning for persuasion with no regard for underlying fact or truth -- it's pure logic. Group B follows a path where fact and truth matter and persuasion is conditional on fact-finding and true conclusions, not merely valid ones.

Which evolutionary path survives?

My money is on the second group.

Tom West writes:

John, when you talk about group A, you mention pure logic. But persuading human beings often involves anything but logic; it's often an appeal to a variety of emotions and insecurities. Could you clarify?

Also, remember that basic truths are going to be undisputed in both groups, so we're only looking at more abstract, harder to discern truths.

To give the executive summary of my position, much of group survival in the past is a prisoner's dilemma. Putting yourself in peril for the group is not personally wise, but strategically necessary. Evolution has wired us so that we *can* be persuaded to do the logically stupid to ensure group survival.

In your example, I think group B never invents religion which means they never invent civilization.

John writes:

Tom, I was thinking of the difference between valid logic and true conclusions.

You're correct that quite often people are persuaded by invalid arguments that make strong appeals to their emotions.

I'm not sure that religion is illogical. I think it can be seen as an attempt to explain the external world when little to nothing is known about the nature of the external world. It's something of an extrapolation of empirical observation -- at least once we're at the stage of doing anything that's creative and manipulates what nature provides (tools, huts, clothing).

Similarly, I see some people take what are assumptions and hypotheses in science as if they were fact, and defend that position as strongly as deeply religious people do.

Interesting thought, though -- could civilization have been avoided by man?

perfectlyGoodInk writes:

Thus, reasoning works much better in groups.

More specifically, within diverse groups. You can get groupthink otherwise.

Even a diverse group is unlikely to have its members' confirmation biases exactly balanced, and confirmation bias itself is unlikely to make the people who disagree really, truly listen to each other. The winning side is more likely to be the one with the most persistence or, as Joseph K points out, the most dominant personalities, than the one standing on the side of truth.

Personally, I see confirmation bias as further evidence for incompetent design.
