Arnold Kling  

Avoiding Truth


My latest essay says,


The great mass of people form their political beliefs with little regard for facts or logic. However, the elites also have a strategy for avoiding truth. Elites form their political beliefs dogmatically, using their cleverness to organize facts to fit preconceived prejudices. The masses' strategy for avoiding truth is to make a low investment in understanding; the elites' strategy is to make a large investment in selectively choosing which facts and arguments to emphasize or ignore.

What I call the high-investment strategy for avoiding truth is a phenomenon that interests me a great deal. After I assimilated this idea (from Jeffrey Friedman's piece in Critical Review), I started seeing example after example of people (sometimes myself included) following the high-investment strategy.

For me, this is an important phenomenon, and that makes this an important essay. Read the whole thing and comment.




CATEGORIES: Political Economy



COMMENTS (12 to date)
mjh writes:

Re: incoherent behavior, the article says:

Most voters lack elementary knowledge of our political system, they hold views that are ideologically jumbled and logically inconsistent, and their opinions change over time in ways that suggest almost random behavior.

My political views have definitely changed over time. Should I attribute this to incoherence? I typically attribute it to having learned more about where my previous political views were lacking. What is a reasonable way to discern which explanation is best?

Arnold Kling writes:

mjh,
If your views change back and forth, without your obtaining new information or analysis, then that would be incoherent.

mjh writes:
If your views change back and forth, without your obtaining new information or analysis, then that would be incoherent

Well, that makes me feel good, but I don't think it really answers the question, because I'm certain that everyone, including those whose views appear externally incoherent, could take comfort from that statement. For example, if today I listened to Rush Limbaugh, I have obtained new information and analysis which may sway my view. But tomorrow I may listen to Air America, and again I have obtained new information and analysis which may sway my view. Not many people are willing to listen to both ends of that radio spectrum. But I think it's easily possible that I'm swayed by external inputs in ways that make my beliefs appear incoherent from the outside.

I'd like to think that the difference is that I am able to find issues on which I can disagree with people who mostly hold similar views. I'd like to think that my views are coherent because I make a reasonable effort to understand the counterarguments. But I'm sure that I'm swayed more frequently by my biases and am much more likely to become a "motivated skeptic".

My personal goal is to figure out what's true, and stand by it. I lean in a particular direction today because I think that direction is closer to the truth. But in 5 years, I may discover that it was all an illusion, and may lean in a different direction. If that appears incoherent to external viewers, I don't really care. But it makes me wonder how incoherent the "low investment" crowd actually is, compared to how incoherent they appear.

Arnold Kling writes:

Perhaps you should try to obtain a copy of the latest issue of Critical Review. I think you will come away from reading it more comfortable with the distinction between incoherent views and a rational person coming to terms with new information.

Matthew Cromer writes:

One of your best essays, Arnold. And you are right on the button about how elites fool themselves through a selective and filtered reading of the data. . .

conchis writes:

Arnold,

Support for the idea that elites can follow a high-investment strategy to avoid truth can also be found in Philip Tetlock's "Expert Political Judgment", where he documents that in people with a "high need for closure", expertise can worsen judgment and predictive ability by making it easier to justify one's pet theories.

The other interesting thing to come out of this, though, is that not everybody does this, and that tests of cognitive style appear to be able to predict who will and won't suffer on some scores as a result of greater expertise.

jaim klein writes:

If I understood the gist of Arnold's idea (which in Arnold's view would be a small miracle, since everybody is running away from truth and understanding), there is a parallel between markets and politics, in the sense that both are systems engaged in processing information. Markets process information and express the result in, say, a price, which is true; political systems also process information, but their final outcome is a lie.

Fishing out individual samples of information being processed in the political system, say a drunken incoherent statement in a bar or a fine elaborate op-ed in the New York Times, we find that the factual content is either grossly ignorant (the bar) or cleverly tendentious and false (the NYT). In either case there is a clear effort to avoid the truth.

I think Arnold is comparing two different things: price vs information (truth) in the political process. In my opinion, the equivalent of price in the market, is power in the political system. If the outcome of the political process is that Bush sits in the Oval Office and commands four million armed warriors, while Gore writes about climatology, then that outcome is true. Bush commanding the Army is true in the same order of magnitude as the 62 US$/barrel oil quotation.

What I want to say is that the truth content of political statements is irrelevant, meaningless. They are only intermediate products towards the end product, which is who gets the power. Allow me please to quote my favorite politician, General Juan Domingo Peron: "La unica verdad es la realidad" (the only truth is reality) which I interpret as: Everything is lies and nonsense, the only truth is who gets the power.

Regarding the question of whether the political system can process information efficiently when, by sampling every kind of information, statement, and article within it, we find that there is not a grain of truth in anything, I think the answer is yes: the final outcome is true and represents the actual relation of forces in the field. Bush was stronger than Gore, not because that is the outcome of the process, but because he was stronger before, and the true outcome was Bush giving orders to the American Army and its generals. A different outcome would have been, I believe, a lie. Elective systems, as a rule, produce true outcomes.

If the outcome of the political process is that Bush sits in the Oval Office and commands four million armed warriors, while Gore writes about climatology, then that outcome is true. Bush commanding the Army is true in the same order of magnitude as the 62 US$/barrel oil quotation.

That's not Arnold's point, but you've inadvertently illustrated what is: you're factually wrong about two things; oil is under $55 a barrel as I write this, and we haven't had four million armed warriors since WWII.

LBJ and Nixon had 3.5 million military to fight Vietnam. Reagan had 2.4 million men and women in uniform to face down the Soviet Union. Bush I cut that number to 1.7 million, and Clinton further downsized it, bequeathing GW Bush a 1.4 million force, only a fraction of which is combat ready.

While the above may seem to be nitpicking, in the case of oil prices, we know exactly what they are every time we fill up the gas tank. And we adjust our behavior accordingly.

In the case of the number of 'armed warriors', almost no one knows the truth. Even intelligent and educated people like yourself. Much less do we adjust our behavior to that reality.

Btw, for the rigorous version of Arnold's argument see Tom Sowell's Knowledge and Decisions.

http://www.amazon.com/Knowledge-Decisions-Thomas-Sowell/dp/0465037380/sr=1-1/qid=1168020928/ref=pd_bbs_sr_1/002-0392460-3947203?ie=UTF8&s=books

daveinboca writes:

Arnold Kling has a libertarian take on the fundamental hypothesis that only about 10% of the population invest any real energy in politics. And within that group, each of the paired-off opponents tends to give his own predilections an overwhelming bias in sorting out new information. As Kling succinctly sums up:

The masses' strategy for avoiding truth is to make a low investment in understanding; the elites' strategy is to make a large investment in selectively choosing which facts and arguments to emphasize or ignore.

So we have Matthew Arnold's "ignorant armies" on a "darkling plain." Or do we?

I believe in democracy because I distrust the elites. I distrust the elites because I believe that self-deception is widespread, and the elites are particularly skilled at it. Accordingly, I believe that it is important for those in power to have the humility of knowing that they may be voted out of office.

Others believe in democracy because they are hoping to see the triumph of a particular elite. Many liberals want to see sympathetic technocrats manipulating the levers of government, nominally for the greater good. I see government technocrats as inevitably embedded in a political system that inefficiently processes information. The more they attempt, the more damage they are likely to do.[MY EMPHASIS] Many conservatives want to see government used for "conservative ends." However, I believe that the more that government tries to correct the flaws of families, the more flawed families will become.

"That government governs best which governs least," said Honest Abe, before the onset of the colossal catastrophe which post-war American government has become. I thank God daily for Medicare, but a single-payer system would be Iraq times ten, a catastrophe which would make Canadian wait-times move from months to years, just to take one example.

I worked for the US government for over a decade, and dealt with it in one way or another, on a working basis, for three. The bigger government is, the worse it operates. When I was a Beltway Bandito working with Booz Allen Hamilton, I learned of an unpublished study concerning the Pentagon which had been sponsored and paid for by the military itself. It concerned how best to cut back the size of the military bureaucracy. [This was in the early days of the Reagan presidency.] My fellow consultant-informant told me that the study had suggested that ANY WAY that personnel were cut back would be preferable to the overpopulation of military technocrats and bureaucrats then functioning in the DC area. Indeed, the suggestion was made in the study [never published] that in the interests of efficiency and economy, it would be better if one out of every three names, regardless of rank or position, were randomly selected out of the Pentagon phonebook and dismissed than it would be to let the payroll be maintained at its bloated previous levels. This was obviously meant to make a point and not to be carried out, but the study never saw the light of day. I wonder why, and who killed it?

Urban legend?

"And we are here as on a darkling plain Swept with confused alarms of struggle and flight, Where ignorant armies clash by night."
Welcome to the post post-modern world.

agent00yak writes:

While I don't doubt bias exists or that democracy doesn't work, I think the conclusion of bias taken from the second paragraph of the opening quote is incorrect. Affirmative action and gun control deal with issues where conflicting values can matter more than conflicting evidence. If the evidence of affirmative action is that on the margin it reduces inequality but decreases meritocracy, then of course both groups will leave with strengthened perceptions - especially the sophisticated people who are acutely aware of their values.

For reference:
"But what about ordinary citizens?...On reading a balanced set of pro and con arguments about affirmative action or gun control, we find that rather than moderating or simply maintaining their original attitudes, citizens - especially those who feel the strongest about the issue and are the most sophisticated - strengthen their attitudes in ways not warranted by the evidence."

Andrew M writes:

Two remarks on Arnold's opening quotation.

(i) It may not be irrational to give views one agrees with an easier time than views one disagrees with. Critical attention is a scarce resource. In light of that, a sensible cognitive policy for an individual may be (A) to adopt any new belief we encounter from a prima facie plausible source, so long as it coheres with our present views; but when we encounter a belief that's inconsistent with our present views, (B) to devote cognitive resources to trying to eliminate the troublesome inconsistency, the obvious place to start being the new belief.

(ii) Individual psychological biases (if that's indeed what they are) may be less important to good cognitive outcomes than is often thought. Given a community of inquirers, it may turn out that every view gets sceptical attention from someone; it doesn't necessarily matter if nobody gives his or her own views sceptical attention.

Andrew M

jaim klein writes:

The quality of the outcome of any decision-making system is a function of the quality of the information being processed. When the Soviet system degenerated and economic statistics became unlinked from reality, the whole structure collapsed. When a political system is based on public discourse with low truth content (disinformation and propaganda), the quality of the outcome may suffer.

Comments for this entry have been closed