David R. Henderson  

Henderson on Robots, Jobs, and Productivity

If you're still worried that robots will be too human-like, consider what happened to men's jobs when women, who not only are human-like but also are actual humans, increasingly entered the labor force. Men's jobs didn't decline; they increased. In 1950, before the large entry of women into the U.S. labor force, 43.8 million men and 18.4 million women were employed. By 2015, women's employment had skyrocketed to 78.0 million, while men's employment, far from shrinking, almost doubled to 84.4 million.

The simple fact is that the amount of work to be done in the economy is unlimited. What's limited is the number of humans, which is why the late population economist Julian Simon called humans, in a book by the same name, "the ultimate resource." There's a story--perhaps apocryphal but no less insightful for that--about an American engineer visiting China in the 1960s, when the Chinese government was building a dam. The American, noting the large number of workers digging with shovels, told his Chinese host that the digging could be done more quickly if the Chinese used steam shovels. "Oh," answered the host, "but then there would be fewer jobs." "I didn't realize that was the goal," answered the American, "but if your goal is jobs, you might consider replacing the shovels with spoons."

What this story illustrates is that although jobs are important for creating value, if we can create the same amount of value with less input, it's wise to do so. Who, for example, wouldn't want an innovation that allowed them to do their current job and be paid just as much, while working half the time? This is not a fantasy. Pay is closely tied to productivity. The hypothetical innovation would destroy "half a job." And we would love it. We would use that freed-up time for leisure, or, more likely given our unlimited wants, for doing other work that gives us pecuniary rewards. That is the story of economic growth.


This is from David R. Henderson, "Will Robots Steal Human Jobs?" Defining Ideas, August 17, 2017.


COMMENTS (15 to date)
Market Fiscalist writes:

While what the post describes is of course possible, I think a more dystopian outcome is also possible.

Suppose technology advances to the point that there are robots that can produce everything including other robots.

At this point human productivity has become very high. But might it become possible that entrepreneurs move to a capital-only economy and decide that hiring humans, with all their failings, is not worth the bother? Capital owners become very wealthy. Non-capital-owners have to go back to subsistence levels of existence.

What economic laws guarantee that this outcome couldn't occur?

JFA writes:

While I tend to be a little more optimistic than most, I think there is some cause for concern. First, I know that you are making the point that economic activity is not a fixed quantity, but when you make a big deal out of male employment doubling without mentioning that the total population also more than doubled, it seems a bit misleading. Second, there has also been about a 10 percentage point decrease (98 to 88) in the male (aged 25 to 54) labor force participation rate between 1954 and 2015. You can interpret that a few ways, but I would be reluctant to put too much of an optimistic spin on it.

TM writes:

Yes but prime age male workforce participation rate is significantly down since then:

https://data.bls.gov/timeseries/LNS11300061

And real wages have been flat.

Thaomas writes:

I agree with your big point, but

"Pay is closely tied to productivity."

is not necessarily true in theory, and has not been true in practice since ~19XX. You must have seen the graph.

Jerry Brown writes:

This Henderson you are quoting is a pretty smart guy and I think he is right when he says "The simple fact is that the amount of work to be done in the economy is unlimited."

But the fact that there is an unlimited amount of work that could be done does not translate into unlimited amounts of work that can be made a profit out of at the same time as providing a decent existence for the citizens who would do the work. Does he have any thoughts on that?

@David Henderson
Wonderful analogy. Thanks for the optimism, and for crediting Julian Simon — a hero.

Jerry Brown asks:

But the fact that there is an unlimited amount of work that could be done does not translate into unlimited amounts of work that can be made a profit out of at the same time as providing a decent existence for the citizens who would do the work. Does he have any thoughts on that?
I have many thoughts on that. In fact I have what I believe is a whole new scientific theory, a new model for understanding living processes. These living processes include (of course) the humans and jobs at the heart of your question.

The theory, which I call the Resource Patterns Model of Life, suggests human life will continue: free, safe, and wealthy in many ways; but also controlled and overpowered in ways we hate to consider.

For a chilling analogy, consider the present fate of single-celled organisms, which were the highest form of life on Earth one billion years ago. Single-celled organisms live on today all around us. But the single cells with probably the highest standard of living--also descended from those of a billion years ago--live inside our bodies, incorporated into an encompassing order (an organism), all sharing the same command set (mind?) of chromosomes.

Don Boudreaux writes:

Thaomas:

I'm unsure just what you mean when you say that the link of worker pay to worker productivity "is not necessarily true in theory." Of course, theories can and have been offered in which this relationship doesn't hold. Most famously, such a theory was offered by Karl Marx. But in mainstream (and what my colleague Peter Boettke calls "mainline") economic theory, this link is pretty tight.

It's true that in mainstream (and, especially, mainline) theory there can be disequilibria in this and in all other economic relationships. It's true also that mainstream and mainline theories generate their conclusions necessarily with foreground and background assumptions (as does any theory). Such assumptions in this specific case include the absence of significant restrictions on the ability of workers to change jobs, on employers to recruit workers, and on employers and workers to bargain over pay and work conditions. But given these and other assumptions (which in most instantiations of the theory track reality reasonably well), the conclusion that worker pay is linked pretty tightly to worker productivity does hold "in theory."

And it holds in practice. Because the piece documenting this link that Liya Palagashvili and I published a few years ago in the Wall Street Journal is behind a paywall, I quote here the first several paragraphs:

Many pundits, politicians and economists claim that wages have fallen behind productivity gains over the last generation. This "decoupling" explains allegedly stagnant (or in some versions of the story, declining) middle-class incomes and is held out as a crisis of the market economy.

This story, though, is built on an illusion. There is no great decoupling of worker pay from productivity. Nor have workers' incomes stagnated over the past four decades.

The illusion is the result of two mistakes that are routinely made when pay is compared with productivity. First, the value of fringe benefits—such as health insurance and pension contributions—is often excluded from calculations of worker pay. Because fringe benefits today make up a larger share of the typical employee's pay than they did 40 years ago (about 19% today compared with 10% back then), excluding them fosters the illusion that the workers' slice of the (bigger) pie is shrinking.

The second mistake is to use the Consumer Price Index (CPI) to adjust workers' pay for inflation while using a different measure—for example the GDP deflator, which converts the current prices of all domestically produced final goods and services into constant dollars—to adjust the value of economic output for inflation. But as Harvard's Martin Feldstein noted in a National Bureau of Economic Research paper in 2008, it is misleading to use different deflators.

Different inflation adjustments give conflicting estimates of just how much the dollar's purchasing power has fallen. So to accurately compare the real (that is, inflation-adjusted) value of output to the real value of worker pay requires that these values both be calculated using the same price index.

Consider, for instance, that between 1970-2006 the CPI rose at an average annual rate of 4.3%, while the GDP deflator rose only 3.8%. Economists believe that such a difference arises because the CPI is especially prone to overestimate inflation. Therefore, much of the increase in the real purchasing power of workers' pay is mistakenly labeled by the CPI as mere inflation.

Mr. Feldstein and a number of other careful economists—including Richard Anderson of the St. Louis Federal Reserve Bank and Edward Lazear of the Stanford University Graduate School of Business—have compared worker pay (including the value of fringe benefits) with productivity using a consistent adjustment for inflation. They move in tandem. And in a study last year, João Paulo Pessoa and John Van Reenen of the London School of Economics compared worker compensation and productivity in both the United States and the United Kingdom from 1972-2010. There was no decoupling in either country.
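The deflator arithmetic in the excerpt above can be checked with a quick back-of-the-envelope sketch. It uses only the figures quoted there (average annual rates of 4.3% for the CPI and 3.8% for the GDP deflator over 1970-2006); the resulting ~19% gap is simply what those rates imply when compounded.

```python
# Back-of-the-envelope check of the deflator point, using only the
# average annual rates quoted in the excerpt.
cpi_rate = 0.043        # average annual CPI inflation, 1970-2006
deflator_rate = 0.038   # average annual GDP-deflator inflation
years = 2006 - 1970     # 36 years

cpi_factor = (1 + cpi_rate) ** years
deflator_factor = (1 + deflator_rate) ** years

# Pay deflated by the CPI looks smaller than the same pay deflated by
# the GDP deflator by this cumulative ratio:
divergence = cpi_factor / deflator_factor
print(f"Cumulative divergence over {years} years: {divergence:.2f}x")
# -> Cumulative divergence over 36 years: 1.19x
```

Compounded over 36 years, the two price indices diverge by almost 19 percent, which illustrates how large an apparent pay-productivity gap the choice of deflator alone can produce.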

Tom DeMeo writes:

There are two factors that are missing here.

The first is the pace of change. Absorbing 78 million women into a workforce over 65 years is not a useful analogy. What would have happened if we had absorbed that same number over 6 months? Human markets cannot adapt to change as fast as technology can scale.

The second is what this does to the balance of power and violence. The current equilibrium will change dramatically.

JFA writes:

@Professor Boudreaux: I tend to be a wage optimist (though not as much as you). While fringe benefits are obviously important, I think there is a case for being less optimistic than your assessment. One, because employer-provided health insurance is subsidized, the increase in the share of employees' income coming from that source does not necessarily mean that employees experience a dollar-for-dollar gain in value. Second, because medical prices have risen faster than other prices, it might be more accurate to use two different deflators when measuring employee compensation growth: say, CPI excluding healthcare to deflate cash income and a medical care/health insurance price index to deflate fringe benefits. (This is crude, since some fringe benefits come in the form of retirement matching, but there is probably data somewhere that would allow you to disaggregate health insurance from other fringe compensation--maybe the MEPS survey of employer insurance.)

Todd Kreider writes:
"Men's jobs didn't decline; they increased. In 1950, before the large entry of women into the U.S. labor force, 43.8 million men and 18.4 million women were employed. By 2015, women's employment had skyrocketed to 78.0 million, while men's employment, far from shrinking, almost doubled to 84.4 million."

But the population also doubled:

1950: 150 million
2015: 315 million
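Todd's population figures can be combined with the employment figures quoted from the article in a few lines, so readers can judge the population-growth objection for themselves (all numbers are the ones quoted above, in millions):

```python
# Employment as a share of total population, using the employment
# figures quoted from the article and the population figures above
# (all in millions).
employed_1950 = 43.8 + 18.4    # men + women employed, 1950
employed_2015 = 84.4 + 78.0    # men + women employed, 2015
population_1950 = 150.0
population_2015 = 315.0

share_1950 = employed_1950 / population_1950
share_2015 = employed_2015 / population_2015
print(f"1950: {share_1950:.1%} of population employed")
print(f"2015: {share_2015:.1%} of population employed")
# -> 1950: 41.5% of population employed
# -> 2015: 51.6% of population employed
```

By this rough measure, the employed share of the total population rose over the period, though it says nothing about the prime-age male participation decline raised elsewhere in the thread.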

Mark Bahner writes:

Hi,

I've said a lot of this before, but...

1) The effect of AI on the economy and human employment is the most important question in economics. Economics blogs should be dedicating even a majority of their posts to the issue. That's not happening.

2) This is not some distant-future thing. This is something that's going to explode in the next 10-30 years. That's because the cost per million instructions per second of computing power continues to be cut in half in just a little over 12 months. Somewhere in the next 5-10 years, the cost of 20 quadrillion instructions per second (roughly the power of the human brain) will reach $1000. And then just a little over 10 years later, it will be $1. (That's the power of exponential change that occurs on short timescales.)

3) Circa 2004, I was predicting a world per-capita economic growth rate of more than 10 percent per year by 2050. But that was before I really looked carefully at the rate of change in computer hardware. I think the change will occur sooner than that.

4) So I think that the long-term outlook is good. (As Keynes said, in the long run we're all dead; I say that in the long run, no one will need to worry about work.)

5) The transition to utopia (utopia, barring Terminators!) is potentially going to be very rough. One case I think can be seen very clearly already is that autonomous delivery vehicles will destroy brick-and-mortar retail. In less than 30 years, I think that more than 90 percent of all brick-and-mortar retail facilities of all types (Walmart, Target, Kroger, Walgreens, Rite Aid, Lowe's, Home Depot, etc.) will be shut down. They will be replaced by home delivery via autonomous vehicles picking up from largely automated warehouses. So that means that all those people in all those stores will no longer have jobs. I think it's a total cop-out for any economist (or anyone else) to say, "Oh, it'll be fine...they'll find other jobs!" without at least giving some hint of what those other jobs might be. So that's my challenge to you, David. And to Tim Worstall. The change is clearly coming to retail. When all those people lose their jobs, what jobs could they move to? (Keep in mind that they'll have tremendous competition from all the other people who have the same experience and general skill level and have also lost their jobs.)
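The doubling arithmetic behind point 2 can be sketched in a few lines. Note that the $1000 and $1 price points and the roughly one-year halving time are the commenter's assumptions, not established figures:

```python
import math

# Sketch of the doubling arithmetic in point 2. The $1000 and $1
# price points and the roughly one-year halving time are the
# commenter's assumptions, not established figures.
halving_time_years = 1.0
start_cost = 1000.0    # dollars per 20 quadrillion instructions/sec
end_cost = 1.0

halvings_needed = math.log2(start_cost / end_cost)
years_needed = halvings_needed * halving_time_years

print(f"{halvings_needed:.1f} halvings, about {years_needed:.0f} years")
# -> 10.0 halvings, about 10 years
```

Since a factor of 1000 is almost exactly ten doublings (2^10 = 1024), the "$1000 now, $1 a little over 10 years later" claim is internally consistent with a one-year halving time.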

Todd Kreider writes:
3) Circa 2004, I was predicting a world per-capita economic growth rate of more than 10 percent per year by 2050. But that was before I really looked carefully at the rate of change in computer hardware. I think the change will occur sooner than that.
I was in future mode for a bit in 2003 and also arrived at 10% growth by 2050, but didn't think further out than that, since I had already thought in the late 1980s that a computer-driven Singularity of some type would be possible--not inevitable--by 2040 to 2060.

When you have written "within 30 years" for 90% of brick-and-mortar retail to end, I've been curious why you don't think it will likely happen sooner than 2047. If the exponential curve continues out to 2030, I'd think this would happen within 15 to 20 years.

I agree with you about unemployment. In the 1990s I assumed jobs would still be created about as fast as old ones ended, just as Kurzweil has argued. But in 2009 (I don't think it was related to the recession) I began to think there could be a really rough patch of 5 to 10 years--essentially the 2020s--although again I don't think that is inevitable. Kurzweil has said we'll all make a living off of our own websites, but the truth is that if technology races ahead, many of the jobs would be ones unknown today.

I've read economists like Krugman for 25 years and, more recently, Cowen and Gordon on technology, and it is easy to see that they, with no science or computer science background, simply don't think about future technology very deeply and in fact have usually mocked it.

Mark Bahner writes:
When you have written "within 30 years" for 90% of brick-and-mortar retail to end, I've been curious why you don't think it will likely happen sooner than 2047. If the exponential curve continues out to 2030, I'd think this would happen within 15 to 20 years.

Yes, I was simply nervous about predicting 15-20 years from now, given that there currently isn't a single autonomous delivery vehicle on the road anywhere in the world. :-)

If we figure 3-7 years for the first fully autonomous delivery vehicles to appear, and then consider that some brick-and-mortar places could compete for a while by having employees go through the store and load stuff into the autonomous delivery vehicles, it seems to me that 15-20 years for 90% of brick-and-mortar stores to disappear might be too soon. It also seems like a lot depends on how quickly the robots that put things onto shelves and take things off them develop. If you think of a Walmart that has fresh produce (e.g. tomatoes, watermelons), frozen foods, over-the-counter drugs, office chairs, TVs, etc., that's a lot of different items for a robot to pick up and deliver to the front of the warehouse. (Let alone into a delivery vehicle.)

But the thing about exponential growth with doubling times of 1-2 years is that things evolve very quickly. One year nobody has stock-handling robots; then a couple of years later everyone has them.

Mark writes:
I've read economists like Krugman for 25 years and more recently Cowen and Gordon on technology, and it is easy to see that they, with no science or computer science background, simply don't think of future technology very deeply and in fact have usually mocked it.

I have no comments on Krugman or Gordon, but I'll put in a good word for Tyler Cowen on the impact of technology on labor. He wrote:

And if you are an optimist about the cost of producing copper, tin, and steel, you probably should be an optimist about the cost of bringing more humans into the supply chain for labor.

And...

There is simply no reason for the technological optimist to think the cost of reproducing labor and labor substitutes should remain high forever. The higher are real wages, the greater the pressures for such innovation!

I think that's absolutely correct. And given the trends in computer technology, the most likely substitute for human labor is AI.

P.S. At the end of that post, he wrote:

For a useful conversation related to this topic, I thank Bryan Caplan, John Nye, and Robin Hanson, can you guess which one disagreed with me most?

My guess was Bryan...but I don't think Tyler ever said what the correct answer was.

Comments for this entry have been closed