Let me emphasize yet again that I present myself in the
book as an expert on facts, not on values, so my personal values should
not be relevant to the quality of my book.
The fact that you ask #3 suggests you seek an integral over the value
of the entire future. In that case I guess I need to interpret #1 as
no age of em or any other form of AI ever appearing, and it is hard to
interpret "hell" as I don't know how many people suffer it for how long.
But if 0 is the worst case and 10 is the best case, then humans
continuing without any AI until they naturally go extinct is a 2, while
making it to the em stage and therefore having a decent chance to
continue further is a 5. Biological humans going extinct a year later
This is as I expected and feared. Given Robin's values, The Age of Em's disinterest in mankind makes perfect sense. But to be blunt, treating imminent human extinction as a minor cost of progress makes Robin's other baffling moral views seem sensible by comparison.
To be clear, I'm not even slightly worried about robots wiping out mankind. But if it happened, it would be the worst thing that ever happened, and an infinite population of ems would not mitigate this unparalleled disaster.