Bryan Caplan  

3 Answers from Robin

Robin's answers to my three questions:

Let me emphasize yet again that I present myself in the book as an expert on facts, not on values, so my personal values should not be relevant to the quality of my book.

The fact that you ask #3 suggests you seek an integral over the value of the entire future. In which case I guess I need to interpret #1 as no age of em or any form of AI ever appearing, and it is hard to interpret "hell" as I don't know how many people suffer it for how long.

But if 0 is the worst case and 10 is the best case, then humans continuing without any AI until they naturally go extinct is a 2, while making it to the em stage and therefore having a decent chance to continue further is a 5. Biological humans going extinct a year later is 4.9.

This is as I expected and feared.  Given Robin's values, The Age of Em's disinterest in mankind makes perfect sense.  But to be blunt, treating imminent human extinction as a minor cost of progress makes Robin's other baffling moral views seem sensible by comparison. 

In practice, I'm sure Robin would be horrified to see robots wipe out everyone he knows.  Why he sees no need to reconcile his near-horror with his far-optimism is a deep mystery to me.

To be clear, I'm not even slightly worried about robots wiping out mankind.  But if it happened, it would be the worst thing that ever happened, and an infinite population of ems would not mitigate this unparalleled disaster.


COMMENTS (5 to date)
Mike H writes:

Thank you for helping me sleep at night!

Maybe I should plan a fifth kid to truly thank you. (I only have two now, but want at least four.)

(My wife talked me up from two, you talked me up from three.)

Peter Gerdes writes:

Seems totally reasonable as long as you believe ems have experiences. I'm hardly upset that we no longer have people with a wide variety of preventable birth defects, so why would one mind getting rid of this flabby sack of flesh that makes us die?

I mean, surely you don't think that artificial arms and legs are a horror, even if we replaced everyone's arms and legs with better artificial ones. What makes replacing brain cells with silicon chips any different, as long as they perform the same function (both I/O-wise and in terms of generating qualia)?

Also: Do you really think it is possible for people to go extinct once ems exist?

I mean, brain scanning seems a lot more difficult than creating synthetic sperm and egg cells, assembling a human genome, and bringing humans back around as desired. The sperm and egg cells don't need to be perfect, just good enough that the correct DNA nudges the daughter cells closer to the right thing.

Robin Hanson writes:

Your horror only makes sense to others who share your strange conviction that the only creatures who are conscious or have moral value are those made of many atoms whose nuclei have exactly six protons. This seems to me a bizarrely numerological moral theory.

Robin Hanson writes:

How can you possibly accuse me of indifference if I say an event changes the total value of the future universe from 5.0 to 4.9? That is a HUGE harm if the total value of the future universe is huge, which it is.

Jared writes:

Thought experiment: you wake up tomorrow and everyone in the world has been downloaded into computers. Mentally, they are exactly the same in every way, and the cost of robot bodies is negligible. On a scale of 1-10, 1 being dystopia and 10 being utopia, how terrible is this?
