Arnold Kling  

Global Warming, con't


A new study says:


Three top climate researchers claim that the greenhouse gases already in the atmosphere should have warmed the world more than they have. The reason they have not, they say, is that the warming is being masked by sun-blocking smoke, dust and other polluting particles put into the air by human activity.

But they warn that in future this protection will lessen due to controls on pollution. Their best guess is that, as the mask is removed, temperatures will warm by at least 6°C by 2100. That is substantially above the current predictions of 1.5 to 4.5°C.


This makes more sense to me than the consensus forecast. As I pointed out in previous posts, human activity has grown exponentially over the past century, yet the consensus model of global warming is approximately linear--going backward as well as forward.

To me, this is bizarre. I think it's fairly common to have a linear treatment variable and a linear response variable. I think it's fairly common to have exponential treatment and exponential response. And I think it's even common to have linear treatment and exponential response. But exponential treatment and linear response?

The only example I can come up with is an economic example--diminishing returns. You could have exponentially increasing labor on a plot of land, and only linear increases in output. But why should the human impact on climate exhibit diminishing returns--that is, the more we spew into the air, the less effect our spewing has?

Instead of a consensus around a linear forecast, it seems to me that we should have divergence involving various nonlinear forecasts. Maybe that sort of divergence would confuse the general public. But it would make more sense to me!
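One functional form that does convert an exponential treatment into a linear response is a logarithm. A toy numerical check (plain Python, nothing climate-specific assumed):

```python
import math

# If the treatment grows exponentially (doubling each period)...
treatment = [2 ** t for t in range(6)]          # 1, 2, 4, 8, 16, 32

# ...but the response is logarithmic in the treatment...
response = [math.log2(x) for x in treatment]

# ...then the response grows linearly in time: 0, 1, 2, 3, 4, 5.
```

Whether the climate response really is logarithmic in the forcing is the substantive question; the arithmetic only shows that a linear consensus forecast implicitly assumes something like this.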







COMMENTS (7 to date)
Brad Hutchings writes:

Exponential treatment, linear response... How about binary search? Pick a number from 1 to 32. I can guess it in 5 guesses using binary search and you telling me if my guesses are too high or too low. Pick a number from 1 to 1024. I can guess in 10 guesses. Pick a number from 1 to 1,048,576. I can guess in 20 guesses. Pick a number from 1 to 1,073,741,824. I can guess in 30 guesses. Binary search is what computer science types describe as an O(log n) algorithm.

I wonder if, in the context of global warming, what you describe as exponential is really just a faster-than-linear polynomial, e.g. n^2 or n^1.03. In mathematical treatments, exponential usually means c^n, where c is some constant. Exponential things grow mighty fast. In dot-com culture, exponential means anything that actually grows. So I can see where confusion over the term creeps in.

Barkley Rosser writes:

If you want to see confusion, go to Prometheus and see the debate over exactly what the Hansen model of 1988 predicted and whether or not Patrick Michaels distorted Hansen's projections in his 1998 testimony. A lot of the arguments involve people confused over whether assumptions are linear or exponential or what.

Regarding this latest bit, some have argued that sulfates and aerosols were why global temp dropped between 1940 and the 1970s. That it started to go up then may have been due to our successes in reducing SO2 and aerosols while CO2 and other GHGs continued to soar.

OTOH, Hansen himself argues that there may be a warming aspect to aerosols, particularly those that are black carbon. Reducing those could help offset warming. In short, this is all pretty messy.

pj writes:

That's no mystery. CO2 is a linear molecule with a small number of absorption lines in the infrared, and they are all saturated already. This means that adding more CO2 increases absorption only among molecules at the extreme edges of the velocity distribution, and that absorption increases as the log of CO2 abundance. This is exactly what is required to convert an exponential driving force to a linear response.

In any case it's not obvious that the CO2 abundance will grow exponentially.
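[Ed.: pj's log mechanism matches the standard simplified expression for CO2 radiative forcing, dF = 5.35 ln(C/C0) W/m^2 (Myhre et al. 1998, used by the IPCC). A quick sketch of the consequence; the growth rate and time points here are illustrative, not real emissions data:]

```python
import math

def co2_forcing(c, c0=280.0):
    """Simplified radiative forcing in W/m^2 from the logarithmic
    approximation dF = 5.35 * ln(C/C0), with C in ppm."""
    return 5.35 * math.log(c / c0)

# Suppose concentration grows exponentially: C(t) = 280 * e^(0.005 t).
g = 0.005
forcings = [co2_forcing(280.0 * math.exp(g * t)) for t in range(0, 200, 50)]

# The forcing then rises in equal steps: exponential CO2 growth
# produces linear forcing growth, exactly pj's point.
steps = [round(b - a, 6) for a, b in zip(forcings, forcings[1:])]
```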

Tom writes:

pj,

Thanks. That is a clear answer I can understand.

silvermine writes:

But the problem is that most scientists these days seem only to know how to run a linear-fit program, without understanding what they're doing. Hence so many "models" look so wrong...

Glen Raphael writes:

I don't remember where I saw this analogy but I rather liked it: window shades!

Each time you add another set of drapes in front of a window you block a little more light from getting in or out of the room. Once the room is totally dark, it doesn't matter how many more layers of drapes you add; you're not blocking significant extra light with the latest addition.

Similarly with CO2 there is a natural limit to the effect it can have. You can't soak up more than all the energy in the frequency bands it absorbs. So there has to be some sort of function with diminishing returns on the margin as we approach saturation.

M L writes:

Glen,
Maybe you saw the window shade analogy
at www.junkscience.com/Greenhouse/

I found it to be an interesting read.
