Bryan Caplan  

Robin's Turing Test

Robin recently tried an Ideological Turing Test for a subset of my critique of his Age of Em.  He understands me better than before, but there's still room for improvement.  "Correct" means Robin has described my view to a tee. 

Bryan sees sympathy feelings as huge influences on social outcomes. Not just feelings between people who know each other well, but also distant feelings between people who have never met.


For example, if not for feelings of sympathy:

1. Law and courts would often favor different disputants.

2. Free workers would more often face harsh evaluations, punishments, and firing.

3. Firm owners and managers would know much better which workers were doing good jobs.
Partly correct.  The primary issue is that firm owners and managers are squeamish about acting on their knowledge.  But if they were less squeamish, they would admittedly be more interested in acquiring knowledge.
4. The US would invade and enslave Canada tomorrow.
Incorrect.  There are also plenty of prudential reasons not to invade and enslave Canada.  My claim, rather, is that if the US could profitably invade and enslave Canada, it still wouldn't do it.  Indeed, few Americans would consider it.
5. At the end of most wars, the victors would enslave the losers.
Partly correct.  Most wars don't end in abject defeat of the losers.  But without sympathy, abject defeat would lead to slavery, yes.
6. Modern slaves would earn their owners much more than they would have as free workers.
Correct.  Workers' credible threat to quit has massive distributional effects.  How could it be otherwise?
7. In the past, domestic, artisan, and city slaves, who were treated better than field slaves, would have been treated much more harshly.
Partly correct.  Domestic slaves were better treated because most people have greater sympathy for people they see every day.  (Field slaves' overseers saw them every day, too, of course, but overseers were selected for their lack of sympathy).  In contrast to domestic slaves, however, I think artisan and city slaves were better treated because of imperfect information about productivity.
8. The slave population would have fallen less via gifts or purchase of freedom.
Partly correct.  Sympathy explains most of the gifts, but with imperfect information there is a selfish reason to give slaves some positive incentives, which ultimately allowed some to purchase their freedom.
9. Thus most of the world population today would be slaves.
Incorrect.  There would have to be numerous major wars ending in the abject defeat of large populations for this to happen.
Of course even if Bryan were right about all these claims, he needn't be right in his confident opinion that the vast majority of biological humans will have about as much sympathy for ems as they do for mammals, and thus treat ems as harshly as we treat most mammals.
Actually, I expect we would treat ems worse than we treat non-human mammals - closer to the way we treat videogame characters.  The animal/machine divide mightily influences human feelings.  Check out every Twilight Zone episode where attitudes change on a dime once people realize that something that looks human on the outside is only machinery on the inside. 

This sympathy-driven view doesn't by itself predict Caplan's strong (and not much explained) view that ems would also be very robot-like. But perhaps we might add to it a passion for domination - people driven by feelings to treat nicely creatures they respect might also be driven by feelings to dominate creatures they do not respect.

It's even simpler.  Docile slaves are more profitable than slaves with attitude, because owners don't have to use resources to torture and scare them into compliance.  That's why owners sent rebellious slaves to "breakers": to transform them into docile slaves.  Sci-fi is full of stories about humans genetically engineered to be model slaves.  Whole brain emulation is a quicker route to the same destination.  What's the puzzle?

COMMENTS (6 to date)
john hare writes:

#6 isn't correct. Slaves are less productive than free workers. I've worked one slave (work release from prison) and don't care to try it again. He couldn't afford to disagree with me, because even a BS complaint would send him back to real lock-up. The result is that he was less productive than he later was when he worked for me as a free man.

Anyone who thinks that slavery is cost-effective in modern times hasn't thought about the knock-on effects.

Miguel Madeira writes:

"#6 isn't correct. Slaves are less productive than free workers."

But is the difference higher than (or even equal to) the difference between the cost of maintaining (feeding, housing, etc.) a slave and the wage of a free worker? If not, having slaves has a better cost/benefit ratio than having free workers.

What can be argued is that the advantage of lower daily/monthly/yearly costs (and, by the way, the lower productivity) is already discounted in the price of the slave, making the cost of a slave (including the opportunity cost of not selling him) for each individual owner similar to the cost of a free worker.
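Miguel's comparison is just arithmetic: slavery pays the owner only if the productivity gap is smaller than the gap between a free worker's wage and a slave's upkeep. A minimal sketch with purely hypothetical numbers (none of these figures come from the post):

```python
# Hypothetical monthly figures to illustrate Miguel Madeira's comparison.
free_output = 100.0   # value a free worker produces (hypothetical)
slave_output = 80.0   # lower output of a coerced worker (hypothetical)
free_wage = 60.0      # wage paid to the free worker (hypothetical)
slave_upkeep = 25.0   # cost of feeding, housing, and guarding a slave (hypothetical)

profit_free = free_output - free_wage      # owner's margin on a free worker
profit_slave = slave_output - slave_upkeep  # owner's margin on a slave

# With these numbers the slave is less productive yet more profitable,
# because the cost saving (60 - 25 = 35) exceeds the output loss (100 - 80 = 20).
print(profit_free, profit_slave)
```

The point is only that "slaves are less productive" does not by itself settle the owner's cost/benefit calculation; the upkeep-versus-wage gap matters too.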

Thomas B writes:

I think John Hare is right, or largely so.

If a task is easily measured, like, say, picking cotton, then slavery may work because the slave has no way to slack off without getting caught.

But if there are benefits to the application of creativity, and/or the output is difficult to measure, the free worker - treated well - will be able to add value voluntarily. And, of course, monitoring costs are much lower when your workforce isn't actively trying to run away, or destroying valuable assets for revenge (one of the reasons for NOT wanting conscripts in the modern military is that soldiers are left alone with VERY expensive pieces of machinery).

In the modern world, the number of jobs where the slave would be more profitable is probably quite small.

Franz writes:

The second-to-last claim (after number 9) reminded me of an episode of Black Mirror, a British sci-fi show, in which a robot stands in for a human. The episode is called "Be Right Back", and though the analogy with the point made in the post is far from perfect, it does suggest that humans would feel sympathy for human-like robots, even when we are fully aware that they are robots.
Another example of the same point can be found in the movie A.I. There, even as a mere spectator, one finds oneself sympathizing with the robots.

Hesse Kassel writes:

In terms of straight work output slaves should be at least as productive and probably more productive than free, paid workers under all conditions.

This is because the slave owner has the option of using all the incentives that can be offered to free workers, plus more. Of course, slave owners don't usually offer all those positive incentives, because they don't have to and because there are no competing offers coming to the slave, but they could.

There are offsetting factors because there is always a cost involved in enforcement of slave status and possibly some personal risk to people owning or using a slave.

These costs and risks typically rise as slaves come closer to people of value to the owner. This creates an incentive for owners to use more positive incentives with domestic slaves than with field slaves.

The reason for this is that well-treated slaves have more to fear from the consequences of escape, rebellion, or attack.

Kenny writes:

This is at least the second time I've noticed you using fiction as evidence! Twilight Zone episodes are not strong evidence - they're extremely weak evidence - that people would treat ems harshly. Besides, I read a short story about a robot similar to a Roomba that was programmed to shriek in pain when hit or damaged and designed to 'bleed' when punctured. The human character was wracked with guilt for hurting the robot. That seems like evidence just as strong as yours that people are capable of sympathy towards 'machines', at least when they act similarly to us.

Are you really so sure people, in general, will treat harshly 'machines' that can literally beg for their 'lives', cry in anguish or despair, and contact their local government representatives?
