Bryan Caplan  

Society of Lies

Robin Hanson recently inspired me to re-read Tolstoy's "The Death of Ivan Ilych."  In a just world, social scientists of all descriptions would analyze this great work from a hundred different angles.  On my latest reading, though, what struck me was Tolstoy's depiction of the role of lying in society:
What tormented Ivan Ilych most was the deception, the lie, which for some reason they all accepted, that he was not dying but was simply ill, and that he only need keep quiet and undergo a treatment and then something very good would result. He however knew that do what they would nothing would come of it, only still more agonizing suffering and death. This deception tortured him -- their not wishing to admit what they all knew and what he knew, but wanting to lie to him concerning his terrible condition, and wishing and forcing him to participate in that lie. Those lies -- lies enacted over him on the eve of his death and destined to degrade this awful, solemn act to the level of their visitings, their curtains, their sturgeon for dinner -- were a terrible agony for Ivan Ilych. And strangely enough, many times when they were going through their antics over him he had been within a hairbreadth of calling out to them: "Stop lying! You know and I know that I am dying. Then at least stop lying about it!" But he had never had the spirit to do it. The awful, terrible act of his dying was, he could see, reduced by those about him to the level of a casual, unpleasant, and almost indecorous incident (as if someone entered a drawing room diffusing an unpleasant odour) and this was done by that very decorum which he had served all his life long. He saw that no one felt for him, because no one even wished to grasp his position.
Tolstoy's on to something big: Human beings face intense social pressure to lie.  Yet human beings also face intense social pressure not to speak total nonsense.  Imagine, then, that everyone else says X, but X seems false to you.  What should you infer? 

1. You're right and everyone else is wrong.  This view maximally gratifies your ego, but strains credulity.  You're the only person who sees the world clearly? 

2. Your doubts are nonsensical.  This view is maximally humiliating, and fairly plausible.

3. You're right and everyone else is lying.  This view is fairly gratifying for the ego, and fairly plausible too.

Which view should you adopt?  Tough call.  Defaulting to #3 gives you too much credit.  But defaulting to #2 gives mankind too much credit.  After all, aren't people a pack of liars?

Your quandary gets worse once you realize that - abject conformists aside - everyone who thinks for himself is in the same epistemological boat as you are.  Everyone you meet might be silently wondering: Have I figured out what everyone else already knows but refuses to say?  Have I stumbled into risible error?  Or am I perchance the one-eyed man in the land of the blind?

There's one more factor, though, that makes our world extra confusing.  On reflection, virtually everything you know is based on trust in other people!  Life's too short to personally verify more than a sliver of facts.  Unless you actually replicate the experiments in your physics textbook, even your knowledge of "hard science" rests on your unproven - and often false - belief that big groups of people don't converge on shared lies.

If you take my concerns seriously, you could retreat into total skepticism.  You could retreat to the Cartesian view that you'll only believe facts you can directly check.  The more sensible response, though, is to audit your society - and prominent subgroups within it.  Pick socially approved views at random, gather relevant facts you can personally verify, then measure the discrepancy between what everyone says and what you really know.  This exercise won't answer all your questions, but at least you'll know how reliable mankind - your ubiquitous informant - really is.

Not satisfied?  Here's something else you can do: When other people know nothing beyond "what everyone says," be slow to ridicule their doubts.  Under such circumstances, doubts aren't just defensible; they're a strong symptom of truth-seeking.

Global warming is a case in point.  The vast majority of people who believe in global warming have only one real piece of evidence: Climatologists believe in global warming.  In fact, most believers don't even have that.  All they really know is that many non-climatologists say that climatologists believe in global warming.  As far as most non-experts are concerned, the real issue is simply, "Are big groups of people lying?" 

If this sounds paranoid, recall the plight of Ivan Ilych.  Big groups of people often lie.  Maybe you have enough first-hand knowledge to say, "Not in this case."  Yet most people - even people who agree with you - lack such first-hand knowledge.  Ridiculing skeptics may make them shut up.  But when you do so, you're promoting not truth, but mere conformity. 

COMMENTS (20 to date)
Jake Shannon writes:

Solipsism is vastly underrated.

Jonny H writes:

"Whenever you find yourself on the side of the majority, it is time to pause and reflect."
--Mark Twain

RPLong writes:

The truth is hard work, but lying is always easy. I think people lie all the time, and mostly to themselves, to avoid having to think hard about anything.

And smart people are the worst, because they are always so sure that they have things figured out that they need not bother second-guessing their most deeply held assumptions.

I think Caplan is right on when he advises us to periodically audit society's reliability. Sage advice.

Mauricio writes:

So everyone can see the King's wardrobe... wait, what?

Tracy W writes:
"Imagine, then, that everyone else says X, but X seems false to you. What should you infer?"

Your explanation of resulting inferences misses out another couple of options:
4. Say that X is wrong, and see what the response is.

If the response is a bunch of insults with no substance, everyone is probably lying.
If the response has some substance (along with the insults), test the substance.

5. Think about something you're more likely to make progress on, like what to have for dinner.

Shane L writes:

"On reflection, virtually everything you know is based on trust in other people!"

Yes, well said. In this sense I've sometimes thought that science can function in a similar way to how religions function. People trust the scientist/priest, who issues statements that the people don't have the expertise to reject. Of course one could demand evidence from scientists and follow the chain of evidence back, but most of us don't.

I don't mean to denigrate science (or religion!), but I suspect some people who proudly reject religion as superstition are quite as uncritical about scientists as the faithful are about priests and ministers.

Ken B writes:

I'd consider it a great improvement if it were just ridicule. Instead we are regularly treated to accusations of racism, or comparison to Holocaust deniers.

I was asked once what I want on my tombstone. Since it was a rhetorical question, I gave a rhetorical answer:
Ken B 1850 - 2350 "Be more skeptical"

Daublin writes:

A point well worth raising, Bryan. I really like your "auditing" technique.

Here are a few ideas on effective audits:

1. Decide how you will interpret an answer before you ask a question, just like you would as a scientist doing an experiment. Sometimes you'll decide it's not a very helpful question. Other times it will firm you up if you get a surprising answer.

2. Probe more than once. One data point is weak evidence.

3. The more important you think the issue is, the more you should look into it. Shame on people who become bible-pounding missionaries for issues they didn't bother to inquire into.

Auditing like this has many benefits aside from finding the truth. It will also make you more humble and nicer to be around. You won't just be polite to contrary people out of a sense of human decency. You'll be nice to them because you've done some searching yourself, and you know first-hand how hard it is to find the truth.

The Sheep Nazi writes:

You might want to read a bit of Ernest Becker to go with your Tolstoy. IIRC there is half a chapter on Ivan Ilych in Denial of Death, which I think you might both enjoy and profit from. Becker's not so easy to sum up, but here is one go: human beings mostly lie, and this is not surprising, considering what it is that they are up against.

R Richard Schweitzer writes:

This is a bit late and off thread:

Thank you for the reference to Michael Huemer's "The Problem of Political Authority."

I acquired it.

The only thing missing (so far) is the position of "representation" in governments.

Howard A. Landman writes:

"Unless you actually replicate the experiments in your physics textbook, even your knowledge of "hard science" rests on your unproven - and often false - belief that big groups of people don't converge on shared lies."

The problem in physics isn't so much that people believe theories that aren't true. It's that they have filters in place that keep them from seeing some theories that are true. The Aharonov-Bohm effect was an obvious, trivial consequence of Schrodinger's equation from 1926 onward; the SE requires potentials and cannot be correctly formulated in terms of fields alone. People kept trying to "fix" the SE, and failing. In 1949, Ehrenberg and Siday described the effect in detail; their paper was ignored. Ten years later (1959), Aharonov and Bohm rediscovered it, and people named it after them, as if it was something new that had never been seen before, something "surprising" and "unexpected". It remained controversial for decades, despite being a simple corollary of QM, until multiple separate groups confirmed it (most elegantly Tonomura's group at Hitachi Labs in 1986). That's 60 years for mainstream physics to accept a trivial consequence of QM, largely just because it violated what I call "naive gauge invariance", the idea that only fields are real. (Classical EM has this property, but the universe does not.)

guthrie writes:


There seem to me to be two different things going on here. Tolstoy's depiction of a man dying, and of the intimates around him unwilling to deal with that harsh reality, seems somewhat out of sync, to me, with the societal adoption of certain grand concepts like ‘Global Warming’.

There are as many social pressures to be honest as there are to lie. As you mention, 'virtually everything you know is based on trust in other people', especially when it comes to transactions exchanging goods and services. It pays, by and large, to be honest.

But there are other transactions. Human beings are pecking-order creatures. We transact status in every encounter with another human being, no matter what else might be occurring. As social creatures we modify ourselves, and avoid acknowledging these transactions unless there is a dispute (‘how dare he outstare me like that!’ ‘She is invading my space!’). If we didn’t, we would be in constant conflict. This can be construed as ‘lying’ because we’re either not conscious of, or unwilling to make others aware of, these transactions in status.

Additionally, we do what makes us feel safe. If that means avoiding a terrible realization (favorite uncle Ivan’s about to die!), that’s what we will do, damn the consequences. It may be a ‘lie’ to some (as in Ivan’s case, making him feel like a burden to his family), but ’coping’ to another.

When it comes to less critical issues such as ‘Global Warming’, you may be seeing more tribalism at work than lying. ‘The Environment’ is used in certain circles as a signal for the in-group. There are inter-tribe status structures, but that doesn’t seem to be your point. You may want to find a different analogy for what you're trying to convey.

Michael writes:

Your choice of global warming might be a bad example, even if you're trying to specifically defend uninformed global warming doubters.

People may jump on that paragraph and ignore the rest.

Paul writes:

Ha! After reading that quote the issue of global warming immediately came to mind but for a different reason. I work in the fossil fuel industry. I have also worked on Capitol Hill. Although there are many true believers, many people I associate with privately think global warming is largely bunk, yet publicly act like they believe it to be true and that we should do something about it. We're all participating in a lie and we even do it to each others' faces even though we each know the other is lying. I feel like I have fallen down the rabbit hole where we are all required to believe "as many as six impossible things before breakfast."

Anthony writes:

Given that on many issues people lack the ability to evaluate direct evidence, I think the question of what strategy to apply in deciding who to listen to should get a lot more interest than it does. The degree to which this is the case varies person to person, but it's the case for all of us some of the time.

Philo writes:

”[E]veryone who thinks for himself” is, simply, everyone *tout court*. Even those who maximally depend on the advice and testimony of others need to judge whom to trust; trying to pass the buck of judgment off onto others would lead to an infinite regress. One ultimately bears responsibility for his own most fundamental judgments.

You yourself have made a study of most people’s—of society’s—beliefs in the field of economics, and found those beliefs seriously wanting. But you can’t do many more such studies—life is too short—and therefore you have only a slender basis for generalizing about the quality of society’s beliefs.

In the case of economics, people are probably simply mistaken; it is unlikely that they are consciously lying about their economic beliefs. In other cases widely professed beliefs may be quite insincere. In the interesting intermediate sort of case people are perpetrating upon themselves a significant quantity of unconscious or semi-conscious *self deception*, so that their professions of belief are not quite lies, but yet are not fully sincere. The degree of sincerity with which people hold their beliefs varies greatly. Practical beliefs that people act on, with consequences in the short run, are probably held more sincerely than merely theoretical beliefs.

Yes, managing our dependence on others in the formation of our opinions is a major epistemological challenge.

big mac writes:

Bryan, you make a great point!

This is a real phenomenon.

It is shocking what people will condition themselves to believe, even with personally impactful evidence clearly in front of them.

Take this eye witness account from one Argentine regarding the past 10+ years...

"As the press all over the world talks about the political success of the current administration, and mentions the “flourishing”, prosperous Argentina, a clear minority which I’m part of sees things differently. It makes you wonder and ask yourself a few other things as well. Who writes all these praises? What kind of data do they use to make such positive statements? How can a country be booming economically, yet keeps having shantytowns grow at an accelerating rate, poverty, misery and decadence never backing down one inch, and the 3rd greatest inflation in the planet as the icing on the cake? After reading some of the emails people sent me on the “success” of Argentina, I wonder if its just innocent stupidity, lack of professionalism or if there’s more to it than meets the eye and there are other intentions behind it....

...One can only wonder how can such an authoritarian leader earn so much public support? Wasn’t it bad enough when they controlled the media through an unconstitutional law, or what about our retirement funds being stolen (nationalized) right in our faces?"

- Fernando Aguirre, Oct 24, 2011, "And so it ends for Argentina"

From the outside looking in, we can generally agree with Fernando's observations, yet he reports that this is a minority view in his own country.

This seems to support the 3rd category.

big mac writes:

Meant to add that Fernando is not explicitly at #3, but he finds the media descriptions and testaments around him incredible to the point of a collaborative delusion.

Jim writes:

The marketplace is flooded by lies because the first reports of an event tend to give incomplete, or sometimes downright wrong, accounts. Take TARP as an example--most vilify it and characterize it as the program that emptied the public coffers to line the pockets of the rich. TARP actually MADE money for the public, yet it lives in infamy because of how poorly the program was described in the media. Alan Blinder talks about this, and its harm to the government's ability to respond to future crises, in his new book After the Music Stopped. There's a good preview of it on the book's Facebook and Amazon pages.

Would highly recommend it.

