Bryan Caplan  

When Is Uncertainty an Argument for Inaction?

From a naive point of view, uncertainty clearly tips the scales against costly action.  If you're only 50% sure that your transmission is broken, for example, you have less reason to replace it than if you are 100% sure that it's broken.  You could object, "If you don't act now, it may be too late"; but this argument is still weaker than, "If you don't act now, it will definitely be too late."

Global warming skeptics often appeal to this intuition.  The less sure we are that we actually have a problem, the weaker the case for action.  As far as I can tell, their argument is completely solid.

Nevertheless, Tyler argued back in 2006 that uncertainty about global warming is an argument in favor of doing something about it:
Like Arnold Kling, I do not much trust climate models.  Perhaps I have spent too much time doing macro, and the experience carries over.  Nonetheless uncertainty about final effects gives us more to worry about, not less.  It is the worst-case scenarios for global warming which worry me, not the middling scenarios.  Variance is our enemy in this matter.
Brad DeLong approved:
As Tyler Cowen said a while ago, uncertainty in the accuracy of climate models is not our friend and not an argument for inaction.
What's going on?  If you read Tyler and Brad's words carefully, their argument is also completely solid.  The problem is an equivocation on "uncertainty," with each side using the meaning that supports its case.

On the first meaning, "uncertainty" means "uncertainty that a problem exists."  The higher the uncertainty, the lower the expected severity.  If this is what you mean by uncertainty, then uncertainty about global warming (or anything else) is a valid argument for inaction.

On the second meaning, "uncertainty" means "variance around a problem's expected severity."  The higher the uncertainty, the greater the variance.  That's why Tyler explicitly said "variance," and Brad talked about "accuracy."  If this is what you mean by uncertainty, then uncertainty about global warming (or anything else) is a valid argument for action.
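The two meanings pull in opposite directions, and the difference can be sketched numerically. All numbers and the convex cost function below are hypothetical, purely for illustration:

```python
# Two meanings of "uncertainty", with made-up numbers.

# Meaning 1: uncertainty that a problem exists at all.
# Lowering the probability that the problem is real lowers expected
# damage, weakening the case for costly action.
p_problem = 0.5
damage_if_real = 20.0
expected_damage = p_problem * damage_if_real  # 10.0

# Meaning 2: variance around the problem's expected severity.
# Hold the mean fixed at 10 but spread the outcomes. If cost is a
# convex function of damage (each extra unit hurts more than the last),
# the spread raises expected cost, strengthening the case for action.
def cost(damage):
    return damage ** 2  # hypothetical convex damage-to-cost mapping

certain_cost = cost(10.0)                         # severity known: 100.0
spread_cost = 0.5 * cost(0.0) + 0.5 * cost(20.0)  # same mean, more variance: 200.0
assert spread_cost > certain_cost  # "variance is our enemy" under convex costs
```

Note that the convexity assumption is doing the work in the second case: a risk-neutral actor facing linear costs would be indifferent to a mean-preserving spread.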

In the past, I've affirmed my willingness to defer to the expert consensus on global warming.  Lately, I've been disturbed by hard evidence of the experts' ideological biases.  Unfortunately, this means that while I'd like to ask some climate experts to specify which kind of uncertainty they're talking about, I no longer know who to turn to for an objective explanation.  Got any pointers?


COMMENTS (23 to date)
John Thacker writes:

The issue isn't uncertainty about whether a problem exists or not.

The issue is that their argument from uncertainty only makes sense in a one-dimensional problem. (There's a secondary issue of ignoring the possibility that a partial solution is worse than nothing.)

If you only have one catastrophic problem, global warming, then you want to act according to what your expected value is, even if there's a fair probability that no problem exists. (The exception as above would be if you think that there's a fair probability of a new Ice Age as well as global warming, so all that CO2 might save us from an Ice Age and reducing it could be a Bad Thing.)

However, once you admit multiple possible catastrophes, uncertainty can tilt your preferred solution towards something more flexible.

For example, if we have the possibility of both global warming and asteroid collision catastrophes, then uncertainty may reasonably tip the balance in favor of "getting wealthier" and "advancing science and technology" and other strategies that could help us adapt to either catastrophe, rather than placing all our eggs in the most likely basket.

ed writes:

I too am impressed by Weitzman-like arguments about uncertainty.

I just wish more people would have more Weitzman-like concerns about our government's fiscal position. I think a sudden collapse of confidence in US treasury securities is a more serious risk right now than global warming. We really don't know how likely it is or how bad it would be, but it is not comforting to think about. I would think the costs could easily be worse than even the worst case scenarios from the IPCC or the Stern report, and it could happen within our lifetimes.

Prakhar Goel writes:

Perhaps this article will help:

jlr writes:

Well, my perspective as one of those pesky environmentalists is slightly different from that of most who walk this blog's halls. Simply, as change is introduced into a system in relative balance, that system must accommodate the change. I believe (one can only believe in things; truth is elusive) that continuing to emit GHGs and other fun things like particulate matter, NOx, and SOx pushes our system (Earth) to accommodate more and more emissions. Local, regional, and worldwide negative effects of pollution are easily observable in the degradation of the quality of the food we eat, the water we drink, and the air we breathe. So, it seems to me that the uncertainty is not whether the things we do affect the world but how and to what degree those accommodations will manifest.

One of my major complaints when people attempt to look at efforts to reduce emissions through the lens of economics is the blind eye that is cast towards the negative externalities of those less controversial emissions such as particulate matter, mercury, PCBs, etc... Just because the tools/data/models are not well-enough developed to confidently quantify the externalities does not mean that significant costs are not apparent and do not exist.

I do wonder if many have lost sight of the actual definition and intent of the word and idea "conservative."

Rafal Smigrodzki writes:

try Coyote Blog:

This is a longish presentation but it does touch on the assumptions about the positive feedback that is crucial to the catastrophic (>10 C) climate change. Briefly, if climate was a system at the mercy of powerful positive feedbacks that favor warming over present temperatures, catastrophic warming would have most likely happened before (e.g. during the medieval warm period, or during any of the hundreds of warming periods that occurred in the last 30 million years). Although catastrophic cooling (i.e. ice ages) has happened many times, with strong positive feedbacks switching from current-level temps to freezing, the opposite, switching from current temps to massively increased temps, has not been observed in the record, ever. In other words, you can have tipping points between today's temps and glaciation, and vice versa, but not between today's temps and a hothouse. And, of course, current and projected CO2 levels are not unprecedented in history, so you can't claim that we are in danger of exceeding the "design envelope" of the system.

stanfo writes:

I would add an additional uncertainty.

If we take "global warming is happening" as given, what is the probability that it can be reversed at all? And supposing that it can, what's the probability that anything we actually do will reverse it? We have many nested conditional probabilities.
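To sketch this chain with hypothetical numbers (the probabilities below are invented for illustration), the point is that the joint probability shrinks multiplicatively:

```python
# Hypothetical probabilities for each link in the chain.
p_warming      = 0.9  # warming is happening
p_reversible   = 0.5  # it can be reversed in principle
p_policy_works = 0.4  # what we actually do reverses it

# The case for action rests on the product, not on any single link.
p_action_pays = p_warming * p_reversible * p_policy_works  # ~0.18
```

Even when every individual link looks likely, the conjunction can be far less than even odds.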

Snorri Godhi writes:

On the second meaning, "uncertainty" means "variance around a problem's expected severity."

This definition is shaky, as it includes the first meaning as a special case. Perhaps what you have in mind is better described as the distinction between uncertainties on the two sides of the expected mean. The variance (which incidentally might be infinite, as Nassim Taleb helpfully reminded us) is of course the same on both sides of the mean; but if the distribution is skewed, then, even if the problem may not exist at all with a probability of 60%, there may still be a probability of 10% that the problem is more than twice as bad as expected. (I am just making up the numbers, of course.)
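Those made-up numbers can be written out as a discrete skewed distribution (severities in arbitrary illustrative units):

```python
# (probability, severity) pairs: 60% no problem, 30% roughly as
# expected, 10% far worse than expected.
outcomes = [(0.60, 0.0), (0.30, 10.0), (0.10, 40.0)]

mean = sum(p * x for p, x in outcomes)  # 7.0
prob_worse_than_2x = sum(p for p, x in outcomes if x > 2 * mean)  # 0.10

# A 60% chance the problem doesn't exist coexists with a 10% chance it
# is far worse than its expected severity: the long tail drives the worry.
```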

Needless to say, at this point in time I would greet with a healthy dose of skepticism any scientific "consensus" that there is, in fact, such a long tail.

Sonic Charmer writes:

Even if one accepts the 'variance' form of uncertainty, this is still not an argument for action. It may be an argument for considering action, but a genuine argument for action has to take costs/benefits and tradeoffs into account.

We may decide that indeed the climate-change scenarios include significant tail risk, that this tail would be severe and bad for us, and that we should therefore consider trying to actively avoid that tail. From these considerations alone however it would not necessarily follow that any 'action' (i.e. limiting CO2 or whatever) was warranted. 'Doing nothing' (and adapting to specific issues as/when the time comes) may be the cheaper option, regardless of how one defines uncertainty.

Doc Merlin writes:

I recommend you run from this debate as fast as possible, Bryan. It stopped being meaningful and became political a long time ago.

It's ceased to be about truth and has become about power and control over every single human action.

What you can eat.
What you can drive.
What you can wear.
What you can make.
etc etc.
In the words of someone far wiser and sillier than I:

"`The question is,' said Humpty Dumpty, `which is to be master -- that's all.'"

stephen writes:

This is a good article in Skeptic on the two types of uncertainty:

Eric Falkenstien makes some good comments here:

In my experience, the skeptics I listen to always focus on physical uncertainty, real uncertainty, not the little error bars around the models' estimates. Yet it is always interpreted, by the believers, as "the standard error in the model is large". There must be a failure in communication here. How else do you get this wonderful three-part argument:

1) build model with strong assumptions.

2) if model is realistic, take action.

3) if model is unrealistic, take action.

Hume writes:

I'm pretty sure Posner and Becker blogged about this a few years ago.

stephen writes:

Bryan, this is what you are looking for:

Hume writes:

Here is becker-posner (although not exactly on topic):

noahpoah writes:

I came to post the link that Stephen posted (i.e., this). It seems to me that the utility of the Cowen/DeLong invocation of uncertainty as variance depends on the source of the uncertainty. If the model and all its assumptions are right and it produces a large variance, then (maybe) that line of thinking holds water.

Mike Rulle writes:

I know you are asking a general question, applied to this topic. Yet somehow it is this topic which generates this question. This makes me skeptical of the relevance of the question to begin with, at least as it relates to public policy per se.

I am with Doc Merlin on this one. We have taken this particular issue seriously precisely because of its "apocalypse soon" narrative. But it's like Pascal's Wager. If all we need to do is create some kind of "infinity" on one side of an expected value choice (infinity in this case being environmental apocalypse), this will become the strategy of choice for all "science" competing for attention and funds, regardless of how one thinks of "uncertainty".

Science has to pass some kind of plausibility test beyond "anything is possible" thinking (it is "possible" my dog is telling me to write this note). Asteroid science passes the plausibility test. Predictions about the temperature 3 years or 30 years from now don't, until proven otherwise. And it is simply not good enough to say "by then it will be too late".

dkite writes:

Your illustration of a transmission problem may be more apt than you realize.

If you notice a problem in a complex system, there usually are three ways it works out. Sometimes it goes away. Spontaneous remission. Sometimes it just keeps going, doing the same thing without change. And sometimes a minor failure causes a series of cascading failures that ends in catastrophe.

The challenge is knowing which.


steve writes:

To me, the problem with the AGW debate has little to do with the validity of AGW. Rather, the problem is the assumption, nay the certainty, of those who believe in the argument that socialism can actually solve the problem.

Whereas experience shows that every task socialism tries to undertake, with the possible exception of intentional destruction, fails miserably. Socialized food production results in famine, but somehow socialized environmentalism is gonna work.

Eric Johnson writes:

There is definitely uncertainty about how much we may heat up. This was addressed by an IPCC report I read part of. Many of the very worried are worried primarily because of positive-feedback super-disaster scenarios, which they don't think are anywhere near certain to happen.

Pete writes:

I imagine you won't like Real Climate, but William Connolley is very good. Also, Tamino is great. This doesn't have to do with uncertainty, but I think everyone should read his post on climate models.

David C writes:

"And, of course, current and projected CO2 levels are not unprecedented in history, so you can't claim that we are in danger of exceeding the "design envelope" of the system." - Rafal Smigrodzki

This is a misleading statement. Current CO2 levels are not unprecedented in Earth's history, but they are unprecedented in human history.

As far as uncertainty as defined by skeptics goes, I think it boils down to this.

Global warming is primarily caused by human emissions and this will have an overall negative impact on our society. Therefore, reducing human emissions will reduce this impact.

Given the degree of consensus among climatologists there appears to be on this issue, disagreeing with the above statement seems slightly more reasonable than arguing that free trade is bad for the economy, particularly from a lay person's perspective. If you agree with the above statement, then the skeptic's argument about uncertainty becomes invalid, and Tyler and Brad's argument about uncertainty takes hold.

Also, from a libertarian perspective, why is this a major issue? Liberals have the possibility of hundreds of trillions of dollars lost and the death of hundreds of millions of people to worry about, and libertarians are worried about a few billion a year in losses to the economy. The Waxman-Markey bill is estimated by the CBO to cost the economy about $2.2 billion a year without considering benefits from climate mitigation. A large government ramp-up of global warming prevention is such a low possibility that more gains could be had by worrying about the budget of NASA.

Yancey Ward writes:

Well, there are an infinite number of plausible risks one can envision, running the gamut from killing 10% of the people on the planet to killing them all. We've got a lot to do to make sure they don't happen. Let's get started ASAP.

Ben Kalafut writes:

See the 2007 paper of Gerard Roe and Marcia Baker for a quantification of the kind of uncertainty that is at work here.

The methodology is straightforward and sound.

Note the shape of the distribution. It seems that the "skeptics"--and I hesitate to use that term as it implies the mainstreamers are bad scientists--are misrepresenting uncertainty as being symmetric and centered on zero.

aaron writes:

Don't forget, there are two tails. We tend to be biased and look at one too closely.

Evidence suggests we've been looking at the wrong tail. The cold tail is at least as likely and more costly.

Any amount of GW we are likely to produce will be beneficial and also thin the cold tail at least as much as it increases the hot tail.

Getting some more hot tail sounds good to me.
