Arnold Kling  



That is, "The Great Stagnation" and "Race Against the Machine." Tyler Cowen writes,

I understand how the TGS argument fits into the cyclical story of 2007-2011 (excess confidence and overextension, Minsky moment, AD contraction, AS problems slow down the recovery), but I don't understand how the Race Against the Machine story interacts with the cycle, or if it is even supposed to.

Read the whole post. I could not decide what to excerpt. My answer would be that RATM describes the last fifteen years, except that in the late 1990s and in 2005-2006 we experienced false booms due to exuberance. Rather than creating sustainable patterns of specialization and trade, we created unsustainable ones, particularly in 2005-2006, and that is why we had such a hard landing. The late 1990s boom was not nearly as distortionary, for a number of reasons. Laying extra fiber-optic cable a few years early was disastrous for some companies, but not for the economy as a whole.

I think that the most interesting challenge posed by Tyler in the video of the year concerns the health care and education sectors. Why are they not improving? Some possibilities:

1. They are hard to improve because people are harder to reshape than things (Tyler makes this point).

2. They are more complex, and we need to see more iterations of Moore's Law before computers can help (I think Tyler also takes this point of view).

3. The incumbents are better at fending off innovation, by using credentialism and persuading consumers that it is risky to adopt new processes. This is the "fortified towns" story of my jobs speech.

My opinion is that the chances are increasing that we will see sudden "tipping" in education away from traditional models. I think that the technology is pretty much here to do better than the old-fashioned classroom. It's being held back by the incumbents, but they are going to lose, just as the music publishers have been losing and the book publishers have been losing.

In health care, I am not sure that all of the necessary technology has arrived to replace your doctor with a computer that uses DNA, scans, and blood samples to develop treatment plans. But I would estimate that the chances are greater than 50-50 that we will be there within a decade. Again, there will be adoption lags. I expect the medical profession to undertake an all-out effort to raise fear, uncertainty, and doubt about technology to replace the doctor, until eventually people realize that they have more fear, uncertainty, and doubt about doctors themselves.

COMMENTS (9 to date)
Nathan Smith writes:

It seems to me that the idea that health care is an entitlement comprehensively distorts the health care industry with perverse incentives. Since the incentive for consumers to control cost is taken away, innovators have an incentive to invent just the things that will yield maximum dollars per patient. Maybe we'll have to rely on developing countries where medicine is not distorted by social safety nets to yield new answers. How many people would get open heart surgery in the US, rather than in India for $2,000, if they had to pay out of pocket?

Transcriptome writes:

> In health care, I am not sure that all of the necessary
> technology has arrived to replace your doctor with a
> computer that uses DNA, scans, and blood samples to
> develop treatment plans. But I would estimate that the
> chances are greater than 50-50 that we will be there
> within a decade.

Arnold, don't be offended but you obviously don't know anything about medicine. Sorry.

Thomas DeMeo writes:

The options for entrepreneurs to pursue opportunities without needing to risk hiring permanent employees have exploded in the last several years. It wasn't so long ago that you had to hire people to take a shot at starting most businesses. Now you don't.

Rick Hull writes:

Regarding Moore's Law, we are *well* into the elbow of diminishing returns. We can keep making the knife sharper, but for all but the most specialized of tasks, it was sharp enough 10 years ago. We are running into physical limits of size and energy/heat. Which is not to say that some amazing tech won't burst onto the scene.

But the limitation in the advancement of computer technology is not how many transistors we can pack into a discrete chip; rather, it's harnessing that power. If only software engineering kept pace with Moore's Law.

As well, I think there are some fundamental limits to human abilities when it comes to managing complexity.

Thus, advancements in really complicated things due to technology will not be driven by simple metrics like Moore's law.

Slocum writes:

I think health care and education have to be separated, because health care actually *is* improving and because #1 applies there (e.g. new drug discovery is very hard).

For education, though, I'd vote heavily on #3. We already have all the tools we need to vastly improve the efficiency of education, but the incumbents are powerful and entrenched and there's also a lot of cultural path dependence and risk aversion.

Mr. Econotarian writes:

> I am not sure that all of the necessary technology has arrived
> to replace your doctor with a computer that uses DNA, scans,
> and blood samples to develop treatment plans.

Clinical Decision Support systems (CDSs), software designed to assist physicians and other health professionals with decision making tasks such as diagnosis, have proven to aid doctors - from Wikipedia:

A 2005 systematic review by Garg et al. of 100 studies concluded that CDSs improved practitioner performance in 64% of the studies. The CDSs improved patient outcomes in 13% of the studies.

Yet no doctors I know regularly use them. In addition, I have known only one doctor who used electronic communication with the pharmacy rather than error-prone handwritten notes.

J Storrs Hall writes:

There is a word, and concomitant cultural protection, for the two sectors that you mention. Education and medicine are priesthoods, in a way that other "walled towns" such as union labor are not. People tend to idolize, or at least look up to, doctors and professors in a way that causes the suspension of judgement and allows the widespread belief that any attempt to change the current system is an attack on learning or health itself.

John writes:

As a minority PhD computer scientist who is a professional software developer and not an academic, I've been following this issue for all of the past 3 decades and then some, screaming like Chicken Little, and I'm too weary of it now to comment here in depth - I could write books about it - maybe I should.

But I'd like to take a stab at your question: why are healthcare and education not improving?

The basic economic issue is not the objective reality of technology's impact, but economic players' perception of it - i.e., the conventional mindset. In healthcare and education, the objective truth about impacts on productivity is frankly the opposite of the perception. The conventional mindset is that technology improves productivity (and that it should be used to reduce costs - particularly labor costs - rather than to increase production, ceteris paribus), and that mindset holds in those two areas, but the truth is that technology is actually having the opposite effect - it has been damaging to productivity, and for various good reasons. In the rest of the economy, the truth and the mindset line up; they are positively correlated, not negatively as they are in those two areas.

I think the enlarging debate, though, otherwise misses far too much.

First, MOST human intellectual activity that qualifies as paid labor is amenable to productivity improvements via IT. Medicine is a good example. Medical doctors have to be smart people, but medicine should be science, and science should be amenable to automation - more so than non-scientific or intellectually less precise endeavors. Think about it this way: creativity and the wide range of human personality are really only welcome in artists, and art careers aren't even considered "real work." Most employers consider such human characteristics a bad thing in jobs that consist of specific and highly repetitive work, even in professional-tier jobs like law and medicine. So the problem of automating most work is not really so hard in practice.

So it's a mistake to think that IT will not have significant impact until "autonomous automation" - self-managing robots - are more widespread. Very small and incremental improvements in technology are significant, and have been happening for decades.

Then consider the pace of economic growth generally - a few percent per year. Then consider measures like Moore's Law, that dwarf that pace. My two smartphones (Google's reference Nexus models) are both faster and have more storage than the biggest supercomputer in my state when I was a grad student, barely a decade ago. That supercomputer occupied a whole building in the state capital; I can put both of these smartphones in the same shirt pocket.
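The gap John describes can be made concrete with a back-of-the-envelope compounding calculation. The numbers here are illustrative assumptions, not data from the post: roughly 2% annual economic growth versus a Moore's-Law doubling in transistor counts every two years.

```python
# Back-of-the-envelope comparison over one decade (illustrative figures).
years = 10
econ_growth = 1.02 ** years      # economy growing ~2% per year
moore_growth = 2 ** (years / 2)  # transistor counts doubling every ~2 years

print(f"Economy after {years} years: {econ_growth:.2f}x")
print(f"Chip capacity after {years} years: {moore_growth:.0f}x")
```

Over a single decade, a few percent per year compounds to only about a 22% gain, while a two-year doubling cadence compounds to a 32-fold gain, which is the sense in which Moore's Law "dwarfs" ordinary economic growth.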

The mindset of those who deploy IT is not the mindset of the economist or financial expert. It's a mindset enthralled by the magic of technology, and thus an urge to be involved with it whether it's useful or not.

But the fundamental purpose of IT focuses that awestruck mindset on the economic purpose of cost reduction, since that is the basic purpose of IT itself. I.e., if one wants to use the latest gadget, what will one use it for? Simple answer: to improve productivity not by producing more of anything, or by doing new and different things that make more profit, but by doing things one already does, faster. The basis of productivity improvements due to IT is speed, not machine intelligence: IT does simple things thousands of times faster than they can be done without it.

Bryan Willman writes:

I think that TGS, RATM, and perhaps the discussion in this post, are all suffering from problems of measurement and goal definition.

For example, counting the quantity of output in dollars and dividing that by wages. But aside from perhaps Arnold nobody gets paid for writing comments on this blog, so it doesn't show in GDP. But it's clearly output.

As for goal definition, consider education:

I've come to think (as have perhaps others) that education's two main functions are

a. To prepare children socially to function in our society while keeping them safe.


b. To keep a lot of people employed and out of harm's way.

To the extent that a. is true, even perfect teaching computers will be of little effect, since having Johnny and Billy go through fight resolution, learn to play well together, and have what amounts to day care are dominating functions that will cost a lot of money and consume a lot of people. This is true whether Johnny and Billy learn math or not.

To the extent that b. is true, spending on teaching computers will be a net increase in education spending, and have a hard time improving the productivity of the institution.

We'll see.
