I disagreed with most of what Tyler Cowen said in his recent interview with Ezra Klein, but this part launched a vociferous internal monologue:

Ezra Klein

The rationality community.

Tyler Cowen

Well, tell me a little more what you mean. You mean Eliezer Yudkowsky?

Ezra Klein

Yeah, I mean Less Wrong, Slate Star Codex. Julia Galef,
Robin Hanson. Sometimes Bryan Caplan is grouped in here. The community
of people who are frontloading ideas like signaling, cognitive biases,
etc.

Tyler Cowen

Well, I enjoy all those sources, and I read them. That’s
obviously a kind of endorsement. But I would approve of them much more
if they called themselves the irrationality community. Because it is
just another kind of religion. A different set of ethoses. And there’s
nothing wrong with that, but the notion that this is, like, the true,
objective vantage point I find highly objectionable. And that pops up in
some of those people more than others. But I think it needs to be
realized it’s an extremely culturally specific way of viewing the world,
and that’s one of the main things travel can teach you.

Here’s how I would have responded:


The rationality community is one of the brightest lights in the modern intellectual firmament.  Its fundamentals – applied Bayesianism and hyper-awareness of psychological bias – provide the one true, objective vantage point.  It’s not “just another kind of religion”; it’s a self-conscious effort to root out the epistemic corruption that religion exemplifies (though hardly monopolizes).  On average, these methods pay off: The rationality community’s views are more likely to be true than those of any other community I know of.

Unfortunately, the community has two big blind spots. 

The first is consequentialist (or more specifically utilitarian) ethics.  This view is vulnerable to many well-known, devastating counter-examples – the surgeon who could secretly kill one healthy patient and harvest his organs to save five others is only the most famous.  But most people in the rationality community hastily and dogmatically reject these counter-examples.  Why?  I say it’s aesthetic: One-sentence, algorithmic theories have great appeal to logical minds, even when they fit reality very poorly.

The second blind spot is credulous openness to what I call “sci-fi” scenarios.  Claims about brain emulations, singularities, living in a simulation, hostile AI, and so on are all classic “extraordinary claims requiring extraordinary evidence.”  Yes, weird, unprecedented things occasionally happen.  But we should assign microscopic prior probabilities to the idea that any of these specific weird, unprecedented things will happen.  Strangely, though, many people in the rationality community treat them as serious possibilities, or even likely outcomes.  Why?  Again, I say it’s aesthetic.  Carefully constructed sci-fi scenarios have great appeal to logical minds, even when there’s no sign they’re more than science-flavored fantasy.
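
To spell out the Bayesian arithmetic behind that slogan – a minimal worked example, with numbers that are purely illustrative rather than estimates of any actual scenario – suppose your prior probability that some specific sci-fi scenario comes true is one in a million, and you then encounter an argument that is a hundred times more likely to exist if the scenario is true than if it’s false.  By Bayes’ rule in odds form, where \(H\) is the scenario and \(E\) is the argument:

\[
\frac{P(H \mid E)}{P(\neg H \mid E)}
= \frac{P(E \mid H)}{P(E \mid \neg H)} \cdot \frac{P(H)}{P(\neg H)}
= 100 \times \frac{10^{-6}}{1 - 10^{-6}} \approx 10^{-4}
\]

Even evidence that strongly favors the scenario only raises its probability to roughly one in ten thousand.  That’s what “extraordinary claims require extraordinary evidence” cashes out to: against a microscopic prior, a merely good argument barely moves the needle.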

P.S. Ezra’s list omits the rationality community’s greatest and most epistemically scrupulous mind: Philip Tetlock.  If you want to see all the strengths of the rationality community with none of its weaknesses, read Superforecasting and be enlightened.

P.P.S. By “extraordinary” I just mean “far beyond ordinary experience.”  People who take sci-fi scenarios seriously may find this category hopelessly vague, but it’s clear enough to me.