Bryan Caplan  

What's Really Wrong With Cryonics

"I don't want to achieve immortality through my work. I want to achieve it through not dying."
                                                          --Woody Allen

One of the most engaging after-lunch conversations of my life was when Robin Hanson sat me down and gave me the cryonics version of the Drake Equation.  The Drake Equation multiplies seven variables together in order to calculate the number of civilizations in our galaxy with which communication is possible.  The Hanson Equation, similarly, multiplies a bunch of factors together in order to calculate how many expected years of life you will gain by signing a contract to freeze your head when you die.
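The structure of such a calculation is easy to sketch. Hanson's actual factors and numbers are not given in this post, so every probability and payoff below is a made-up placeholder; the only point being illustrated is the Drake-style multiplicative form: chain together the independent conditions for success, then scale by the payoff in years.

```python
# A minimal sketch of a Drake-style expected-value calculation for cryonics.
# All factor names and numbers are hypothetical placeholders, not Hanson's
# actual figures; only the multiplicative structure is the point.

factors = {
    "preserved_in_time": 0.9,    # you die in a way that permits preservation
    "preserved_well": 0.5,       # brain structure survives the freezing process
    "org_survives": 0.4,         # the cryonics organization lasts long enough
    "tech_arrives": 0.3,         # revival/upload technology is ever developed
    "actually_revived": 0.5,     # someone bothers to revive you
}

years_if_revived = 1000  # hypothetical payoff in extra years of life

# Multiply the factors together, Drake-equation style.
p_success = 1.0
for p in factors.values():
    p_success *= p

expected_years = p_success * years_if_revived
print(f"P(success) = {p_success:.4f}, expected years gained = {expected_years:.1f}")
```

The structure explains why the argument concentrates on the scientific sub-disciplines: each factor is an independent bottleneck, and the product is only as large as its weakest terms allow.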

During his presentation, I noticed that Robin spent almost all of his time on various scientific sub-disciplines and the trajectory of their progress.  On these matters, I was fairly willing to defer to his superior knowledge (with the caveat that perhaps his enthusiasm was carrying him away).  What disturbed me was when I realized how low he set his threshold for success.  Robin didn't care about biological survival.  He didn't need his brain implanted in a cloned body.  He just wanted his neurons preserved well enough to "upload himself" into a computer.

To my mind, it was ridiculously easy to prove that "uploading yourself" isn't life extension.  "An upload is merely a simulation.  It wouldn't be you," I remarked.  "It would if the simulation were accurate enough," he told me. 

I thought I had him trapped.  "Suppose we uploaded you while you were still alive.  Are you saying that if someone blew your biological head off with a shotgun, you'd still be alive?!"  Robin didn't even blink: "I'd say that I just got smaller." 

The more I furrowed my brow, the more earnestly he spoke.  "It all depends on what you choose to define as you," he finally declared.  I said: "But that's a circular definition.  Illogical!"  He didn't much care.

Then I attacked him from a different angle.  If I'm whatever I define as me, why bother with cryonics?  Why not "define myself" as my Y-chromosome, or my writings, or the human race, or carbon?  By Robin's standard, all it takes to vastly extend your life is to identify yourself with something highly durable. 

His reply: "There are limits to what you can choose to identify with."  I was dumbstruck at the time.  But now I'd like to ask him, "OK, then why don't you spend more time trying to overcome your limited ability to identify with durable things?  Maybe psychiatric drugs or brain surgery would do the trick."

I'd like to think that Robin's an outlier among cryonics advocates, but in my experience, he's perfectly typical.  Fascination with technology crowds out not just philosophy of mind, but common sense.  My latest cryonics encounter was especially memorable.  When I repeated my standard objections, the advocate flatly replied, "Those aren't interesting questions."  Not interesting questions?!  They're common sense, and they go to the heart of the cryonic dream.

Personally, I'd really like to live forever - in the normal English sense of the phrase "live forever."  I wish cryonics could realistically offer me that.  Unfortunately, the sophistry of its advocates leaves me pessimistic.  If they had a ghost of a chance of giving me what I want, they wouldn't need to twist the English language.







COMMENTS (67 to date)
Tom West writes:

Okay, Bryan, would you use a Star Trek transporter?

david writes:

Presumably there are levels of identifiability - levels of things which you identify as yourself. And presumably the most desirable level may be the not-dying, immortality in the common sense meaning of the word.

But that high standard means that you're likely to be dead by the time that comes about.

Zack M. Davis writes:

Disputes about the definition of what counts as really you get us nowhere; we should try to think about what actually happens, and then only afterward worry about what to call it. If there is a perfect simulation of you that feels subjectively indistinguishable from you, and remembers everything you remember--can you at least see why some people are tempted to say that the simulation is you, for all the purposes that matter? Even if you prefer to say that it's "only a simulation," what are you really disagreeing about? Once you know what kinds of experiences are happening at what times, there's no question left to ask, nothing left to be confused about.

Constant writes:

Bryan Caplan, you aren't criticizing cryonics. You think you are, but what you're criticizing is run of the mill perfectly reasonable philosophy of personal identity, assuming a materialist background. You may as well attack McDonald's because they use eggs in their Egg McMuffin and a chicken once bit you.

Miguel writes:

I usually agree with most of what you say, but this attack on cryonics is rather... weak.

In fact, you're not attacking cryonics at all - but uploading and whether it preserves personal identity.

*You* are just an instance of a program running in your brain. Why can't you accept the possibility of this program running on another substrate?

Of course, right now this program needs *your* brain as a processor/interpreter. You should consider the concept of a Moravec transfer (http://www.acceleratingfuture.com/michael/blog/2006/02/what-is-uploading/) to see how a successful upload might be achieved.

Carl Shulman writes:

Does adding an additional factor for the cryonics Drake equation (for rebuilding the brain and such using methods that you're not squeamish about) really push the expected utility of cryonics from positive to negative? I suspect this isn't your real problem with the idea.

Robert Johnson writes:

It seems pretty obvious that part of living (as opposed to just being alive) is to experience consciousness. If I upload myself, I will not have experiences through the simulation of myself (at least, I can't see how I could). Missing that one key ingredient leaves little more than identifying with 'durable things'. Not a compelling proposition, and certainly not one that needs cryonics in order to be accomplished.

Eric Falkenstein had a great post (http://falkenblog.blogspot.com/2009/11/innovation-is-not-rewarded.html) awhile back about how smart people can be HELD BACK by their creative, and often just plain wrong, ideas. Maybe that's Robin's problem.

alex writes:

There is also the possibility that someone could upload Robin Hanson while he is still alive - for example, if a sufficiently powerful remote sensing technology were able to map his brain from something like an x-ray. Which would be the "real" Robin Hanson?

John Perry's A Dialogue on Personal Identity and Immortality is a very nice, brief book about this sort of thing.

h writes:

Which would be the "real" Robin Hanson?

If it thinks, speaks and writes like Robin Hanson, then who cares if it is the "real" one?

E. Barandiaran writes:

Bryan,
You argue against RH because he appears to have an "unnatural" idea of his own self. I've been thinking about what David Hume would write today if he were to update A Treatise of Human Nature. Any ideas?

Marshall writes:

Is the em constantly smiling and enjoying... chocolate mousse? I would continue reading the em's blog productions... but I would no longer trust its psychology, as it is meatless and timeless.

Tiger-bot-Hesh writes:

I'll only upload my brain into a robot if I can upload into a robot cat.

Peter Twieg writes:

I agree with the criticisms here that the identity issues are only weakly related to cryonics... in general, I'd prefer to see cryonics justified in terms of "this person's characteristic data is worth saving" rather than arguments about the importance of preserving identity and avoiding death.

But to the identity issues, I disagree with both Robin and Bryan somewhat, at least insofar as I understand their arguments. To Robin's point, if it can be accepted that a physical copy of you (or just an emulation) would be you... well, I'm not sure why one should care more about the existence of this other you than you should care about the existence of anyone else. If you do, doesn't this lead to the implication that one should prefer to have many, many copies of himself living all sorts of lives throughout the galaxy? Does Robin's position not imply this?

On the other hand, I'm not a fan of Bryan's essentialism (sorry if this is a dirty word) that would lead to his rejection of cyborg Hanson as not the *real* Robin Hanson. Perhaps in some sense this is defensible, but I'm not sure that this is something that one can justifiably care about - a continuity of identity seems at best to only be a legitimate interest to the person who experiences that identity, not external observers. If Bryan discovered that Robin were a P-Zombie (if he believes such a thing is possible), would he recoil in horror? If so, why?

Bret writes:

Hi Bryan,

Yes the majority of your post is an argument about identity rather than cryonics.

Bryan, let me offer the following - which of the modern funerary arrangements do you find to be your best chance at possible continuation of identity in the future, were you to die before significant advancements in longevity research were made? Assuming that you think it is even possible.

I think you would have to admit, or at least consider, that while cremation affords you no chance at all, cryonics might. What percentage chance could that be expressed as? Some will insist it is 100%. I like to think that the chance is greater than 1%, which is better than the alternative.

Some of us advocates just want a chance, and that is sufficient.

Blackadder writes:

There is also the possibility that someone could upload Robin Hanson while he is still alive - for example, if a sufficiently powerful remote sensing technology were able to map his brain from something like an x-ray. Which would be the "real" Robin Hanson?

Robin would be Robin, and the simulation would be the simulation. What's so difficult about that?

Suppose you wake up in a mad scientist's lair strapped to a table. The mad scientist has cloned you and implanted the clone with all your memories. His plan is to kill you and have the clone assume your identity. I suppose from the perspective of Robin and some of the commenters here, this should be a matter of indifference. To paraphrase Robin, the mad scientist isn't killing you, just making you smaller! I think that's crazy. I'm me. If I'm going to be killed, the fact that there is a clone/impostor running around in my place isn't comforting. If anything, the fact that there will be an impostor makes the situation worse.

Ben Best writes:

Robin Hanson's desire for uploading via cryonics
is characteristic of only a small minority of
cryonicists. Most of us aspire to a restoration
of our bodies and brains in a condition rejuvenated
to the prime of life or better.

Cryonics does not require agonizing over the
nature of personal identity, beyond believing
that personal identity is in the brain. If we
have consciousness and personal identity as
a result of our brains, then to retain and
restore personal identity means we must
retain and restore our biological brains.

The feasibility of cryonics, then, depends
upon our current capacity to cryopreserve brains
and the capacity of future technology to
restore and rejuvenate cryopreserved brains.

To read about the scientific case for cryonics
see my article in Rejuvenation Research:

http://www.ncbi.nlm.nih.gov/pubmed/18321197

http://www.cryonics.org/reports/Scientific_Justification.pdf

For a general overview of cryonics, read my cryonics FAQ

http://www.benbest.com/cryonics/CryoFAQ.html

and the Wikipedia entry on the subject

http://en.wikipedia.org/wiki/Cryonics

-- Ben Best, President, Cryonics Institute

James D. Miller writes:

Bryan, were YOU alive 20 years ago?

If the answer is yes then you don't define YOU by the atoms or cells in your body (which were all different 20 years ago), but by the general pattern of your body. So if this general pattern is preserved by an upload you must still exist.


Eric H writes:

I am unsure whether Robin would have a problem with one of the tricks in The Prestige.

steve writes:

I agree with you in a sense, Bryan. Suppose science really took a quantum leap and could make perfect clones of people, right down to the atomic level, including memories.

I would consider these clones copies of me. I expect that faced with the original and well documented proof the clones would accept this definition as well.

This is little different from the fact that a digital music recording is a copy and not the original music. However, in practice, this fine distinction makes little difference.

A close copy still has significant value to anyone that valued the original item even when the original item no longer exists. And I strongly suspect, even when that original item is yourself.

Ricky writes:

Ha! This is all predicated on the idea that a digital existence will be a satisfactory existence and that the actually fulfilling thing about life is being yourself. The jury is out on that one. Personally, being reincarnated in a simulated format disconnected from the context I exist in sounds like it sucks really, really bad. 10K years from now I might be a doddery old man who is a computer program no one is interested in! What if the singularity happens and my thought patterns are so outdated I have no value to other people?

Let's be honest Epicurus would be a bore if we had him as a computer program today. We'd stick him in an old-folks server and roll our eyes when he tried to make parallels to Hellenic ideas on message board ghettos to the dead-headed un-hip.

M. writes:

I do not know much about the cryonics but I have a question related to the identity of the upload.

Does the upload have a future? Does its personality develop through new experience?

Isn't the "program" running in the brain dependent on the physical processes of the body, such as hormonal processes, etc. For example, if I upload the data of the depressed person right before committing a suicide - under the assumption that the depression was a result of the hormonal imbalances which are not present in the computer - would the upload develop a different future from the original person if they went through exactly same experience?

Elf Sternberg writes:

I'm still trying to figure out the objection here.

What relationship does Bryan Caplan of today have to the Bryan Caplan of 2011? Absolutely none, other than the general impression that, barring disaster, there will be a Bryan Caplan of 2011, just as the Bryan Caplan of right now has only memories and a compulsive belief that he is the same Bryan Caplan as that of 2007.

There are three coherent philosophies of identity: material (as long as some piece of my body maintains coherency, I exist), psychological (as long as something exists that coherently believes itself to be me, I exist), and spiritual (there is a nebulous and indeterminable thing which, as long as it "exists"-- despite any inability to demonstrate its existence-- I exist).

Robin subscribes to one (psychological), as do I, and Bryan subscribes to another. This belief puts intense moral weight on agency and identity: as long as there is something that is "me," for a definition of "me" sufficiently satisfying to "me," with sufficient agency to accomplish what I will to accomplish, then I am satisfied that I exist in some meaningful sense. Individuality is a limitation, not a universal absolute.

Staying Alive is a brief (three question) quiz from Philosopher's Magazine which lets you figure out if you have a coherent philosophy of continuity, or not. I wonder how Mr. Caplan would score.

Jacqueline writes:

You guys are having way too much fun with each other over there at GMU. I bet the economists at all the other departments are jealous.

Michael Nielsen writes:

You're mistaking the atoms in your body for you. Those atoms aren't you --- they're constantly being replenished, so today's neurotransmitter comes from yesterday's lunch, while last week's neurotransmitter is gone, gone, gone. We're patterns, not permanent structures, and patterns can move from hardware platform to hardware platform. We're making that migration constantly.

Blackadder writes:

Bryan, were YOU alive 20 years ago?

If the answer is yes then you don't define YOU by the atoms or cells in your body (which were all different 20 years ago), but by the general pattern of your body.

I'm guessing that the general pattern of Bryan's body was different 20 years ago too. Besides which, if you think that it's the general pattern of one's *body* that preserves personal identity, then clearly a computer "upload" of your personality wouldn't be you, no?

Rafal Smigrodzki writes:

Bryan, if you were to decide that the continued existence of your writings, Y-chromosome, or your favorite teddy bear is sufficient to satisfy your desire for eternal life because of these objects having your personal identity, then indeed you would not benefit from cryonics. This would also make you a very, very strange person, although still able to have a useful, fulfilling life, therefore not really insane. Personal identity, like consciousness, is self-referential (but not circular, not "illogical" as you exclaimed in discussion with Robin), and therefore there are as many valid definitions of personal identity as there are conscious humans.

In the case of people like Robin or me, "personal identity" is simply our indexical information embedded in a material object. By indexical information we define the ego-syntonic memories and proclivities that reliably differentiate between each of us and all other actually or potentially existing humans. If a material object (whether a human brain or a silicon computer) contains my indexical information, then it is a part of me. If it processes information using the thought routines that are indexical of me (i.e. differ from all other humans and sentients, and are sufficiently similar to me to pass muster as "me"), then it is a conscious version of me. Since my writings contain only an infinitesimally small fraction of my indexical information, they are not "me". The same pertains to my Y-chromosome.

Uploading satisfies this indexical information criterion, therefore it satisfies my desire to live. As noted above, YMMV, since personal identity is self-referential and therefore individual, not objective. And frankly, there is no logical argument that would inevitably lead you to our position, just as no logical argument can change a lion into an eagle - they are just different.

Adrienne Barbeaubot writes:

Be a hitter, babe.

Blackadder writes:

In the case of people like Robin or me, "personal identity" is simply our indexical information embedded in a material object.

If you upload your "indexical information" onto a machine then it will cease to be indexical to you, and thus wouldn't be you under your own definition.

Robin Hanson writes:

I respond here.

ajb writes:

Why do I always find Bryan most persuasive when he is arguing against the more extreme Libertarian/futurist/anti-traditional position and in favor of common sense? Ha ha.

Jason Malloy writes:

"If the answer is yes then you don't define YOU by the atoms or cells in your body (which were all different 20 years ago), but by the general pattern of your body. So if this general pattern is preserved by an upload you must still exist."

No, he is defining YOU by your own private, conscious experience of self, not by your atoms and cells. Duplicating your experience of self is not the same as prolonging your own private experience of self.

If I made an exact clone of you right now, with all your same memories up to this point, you and your clone would then be two entirely separate beings, with two separate internal experiences of self. Neither one of you would volunteer to die, because you are two entirely separate people, in the exact same sense that identical twins, or you and I, are separate people.

Immortality through a clone, is just as metaphorical as immortality through kids or immortality through your legacy.

Bryan, it's not quite clear to me where your objection lies, so let me ask...

Do you think that physical systems other than networks of neurons (say, silicon and transistors) are incapable of supporting conscious experiences?

If your old brain was destroyed and a new one created, neuron by neuron, in the same configuration do you think the new brain would "be you" in the sense you deem relevant?

kebko writes:

When you wake up each morning, aren't you really just a simulation of your previous-evening's self? The only reason you wake up as you is because you share the memories of the you that went to sleep the night before. If someone secretly replaced your body in the night with a replacement body that was a good enough simulation that you didn't notice, would it matter to you? If they told you a month later that they had switched your body, would you scream, "Ah, I haven't been myself all month!"?

Alan Crowe writes:

If you want the opposite of Robin's view, read A problem with cryonics. I deny proposition number five of the philosophy of mind page that you link to.

I found the justification for proposition five remarkably weak. The page claimed it was a reasonable proposition because it was part of a pattern of other statements that the page claims are obviously true, but which I also doubt.

"you can't derive a statement describing distances from any set of statements that don't describe distances."

No. This is contradicted by Urysohn's metrization theorem which guarantees that if you have a normal topological space with a countable base you can put a metric on it and start talking about distances.

"You can not derive a statement about colors from any set of non-color statements."

No, to the statement "This LED emits at around 555nm" we can reply "Oh, you mean it's a green one!"

"You can't derive geometrical statements from non-geometrical ones."

No. If a manifold has an affine connection we don't have a metric and cannot do geometry. However, if the affine connection is symmetric in its lower two indices it is a metric connection. We can get a metric out of it and start doing geometry based on non-geometry.

"In the same way, it is a conceptual truth that you cannot derive a mental description from a physical description."

Obviously I think that overcoming technical difficulties would permit the derivation of a mental description from a physical description.

Blackadder writes:

When you wake up each morning, aren't you really just a simulation of your previous-evening's self? The only reason you wake up as you is because you share the memories of the you that went to sleep the night before.

No. For example, suppose I wake up with amnesia (or, more likely, I wake up having forgotten something that I remembered the night before). I don't share the memories of the me that went to sleep the night before, yet I'm still me.

If someone secretly replaced your body in the night with a replacement body that was a good enough simulation that you didn't notice, would it matter to you?

I'm not sure I understand. How would one replace my body? I understand how you could replace me with a duplicate, but how would one go about replacing my body with a duplicate while leaving me undisturbed?

Anton Tykhyy writes:

Jason, kebko: agree entirely. More here.

Anton Tykhyy writes:

Blackadder: but you would have memories of the you at some stage, before amnesia sets in; then the you that woke up would feel continuous with the you before onset of amnesia. After all, if you woke up without any memories (supposing also that they cannot be restored), what would it mean to say that you are you? In a trivial sense, you would be you because you definitely wouldn't be me or any other person, but that's about as far as it goes.

Robin Hanson writes:

I disagree that I like cryonics mainly because I have a more generous concept of who I am than Bryan does. Instead I just consider my self-concept to be less arbitrarily physical than Bryan's. As long as Bryan stayed in the same sort of physical body, I'd bet he'd still think that future person was he, no matter how much he'd changed mentally, as long as any big change came about as the result of many small voluntary changes.

Instead, I could imagine changes big enough that I wasn't me anymore. In fact, I'm not at all sure I'm the same person I was when I was twenty. But if a creature wakes up in a few minutes remembering everything that I remember, and having the same feelings and habits of thought that I do, well that guy is me. Doesn't matter if there are a hundred of them spread across the galaxy, made out of a hundred exotic materials. They are all me, because they all remember being exactly me now and they haven't changed much. If they then experience and change and diverge, well eventually some of them won't be me anymore.

Mikedc writes:

Would you agree or disagree with the idea that having (for lack of a better term) a "chain-of-consciousness" is essential for life?

That is, suppose I were cloned or uploaded, but I, MikeDC, remain alive as I've always been within this body. I wouldn't consider my clones or uploads to be me. We may share unique memories and capabilities up to the moment of upload, but after that we grow apart. We're separate instances of the same program.

My instance of the program is the only one I care about as "my life". I might think it'd be nice to have other instances of myself out there, but it'd be in the same way it'd be nice to have an identical twin. At the end of the day, a copy would be another conscious, sentient life. Not my own.

Thus, if immortality were a movement of my conscious mind from my living (but perhaps soon to no longer be) body to a computer simulation, I'd certainly consider myself to still be alive.

Perhaps problematically from a philosophical perspective, I think I must consider myself dead if (as in your hypothetical above) the power were totally shut off to my brain and I was then totally "restarted". I don't know enough about medicine to know for sure, but I'm not sure that's possible. My understanding is that much of what is "me" is stored in volatile RAM, and even if my body were brought back to life, much of that would be lost. Even were I fully the same, I'd think in some way I'm a new instance, rather than the same one.

Eliezer Yudkowsky writes:

Sigh.

Yes, discussions about cryonics often disintegrate into Personal Identity Wars Part XXIV. It's not even all that large a factor in the cryonics version of the Drake Equation. If you're that attached to your "particular atoms", then the kind of advanced molecular nanotechnology required to do cryonics revival at all could easily enough rebuild each neuron out of the "original atoms" that went into it.

Now it so happens that this is, knowably, unnecessary labor. That's a very long story. And I went into it in full detail on Overcoming Bias (now at Less Wrong). It so happens that we live in a universe where, by a stroke of good fortune, it is very easy to see that personal identity cannot possibly go along with a particular set of atoms - since standard physics explicitly rules out the possibility that atoms can possess persistent individual identities, in principle. It would be like trying to say which factor of 3 is which in the number 18.

And if you learn the basic ontology of standard quantum mechanics well enough to understand the notion of "configuration space" and "identical particles" and what is implied by this, you can actually see - not just be told - that personal identity can't possibly follow particular atoms. You can see that if someone copies your brain, there is not an "original" and a "copy" but rather two originals, or simply two of you. You can see that cryonics preserves exactly everything about you that is preserved by going to sleep one night and waking up the next morning.

But that is a rather long story, and it doesn't even matter all that much in the cryonics version of the Drake Equation.

As linked to in the Cryonics page on the Less Wrong Wiki, the long story is told in the sequence Quantum Mechanics and Personal Identity, of which the crowning post is Timeless Identity.

Jason Malloy writes:

"But if a creature wakes up in a few minutes remembering everything that I remember, and having the same feelings and habits of thought that I do, well that guy is me."

Robin, it isn't you because you are you. There can't be more than one you. There can be clones of you, but their internal experience is necessarily separate from yours.

Death is about the loss of conscious self, and this is the literal death that most people wish to prevent with medical technologies. The continuation of self in similarity is metaphorical immortality, be it through clones, legacy, or children.

In fact, if preserving similarity is your primary goal, then legacy is, no doubt, a vastly better method than cloning or life extension. Plato remains completely static in the form of his biography and his written ideas, but as a person he would have changed dramatically over the millennia in his ideas, personality, and biography.

If Michael Jackson would have died at 25 his legacy would be very different. And that's just a few decades.

Jason Malloy writes:

In fact, I'm not at all sure I'm the same person I was when I was twenty... If they then experience and change and diverge, well eventually some of them won't be me anymore.

How similar do one's attitudes and tastes have to be to qualify as the same person? Keeping people the same despite constantly changing experience seems, to me, to be a much greater technical problem than keeping them from physically decaying. By this definition of self, people are constantly "dying" throughout their lifetimes. It's very plausible to me that a Robin Hanson robot with 2 years of personal development in the alien year 4000 would be no more similar to Robin Hanson 2009 than, say, Bryan Caplan and Robin Hanson are today.

Why bother with trying to gain immortality with cloning and head freezing when your future self will "die" by quickly changing in tastes, experiences, and attitudes?

In fact, if changing intellectually or temperamentally is such a problem, isn't continued life a bigger detriment to this goal than eventual death?

MikeDC writes:

You can see that cryonics preserves exactly everything about you that is preserved by going to sleep one night and waking up the next morning.

Not sure why this is the case. When you fall asleep, your brain is still "powered up". When you die, it's like flipping off the power on your computer. You can flip on the switch again, and it'll boot up, but the information that was in RAM is lost.

In lieu of proof to the contrary, I tend to think that a lot of what makes us "us" is contained in RAM.

Anton Tykhyy writes:

> In lieu of proof to the contrary
Ask people who have had a clinical death experience, there's quite a lot of them around.

Pablo Stafforini writes:

if a creature wakes up in a few minutes remembering everything that I remember, and having the same feelings and habits of thought that I do, well that guy is me. Doesn't matter if there are a hundred of them spread across the galaxy, made out of a hundred exotic materials. They are all me, because they all remember being exactly me now and they haven't changed much.

Robin, I’m baffled by what you write here. Let's say there are as many distinct individuals psychologically continuous with you as there are letters in the alphabet. Are you really saying that A, B,..., and Z are each identical to you? If so, by the transitivity of identity it would follow that A = B = ... = Z. But that cannot be so, because by the indiscernibility of identicals it would further follow that whatever is true of A is true of B, C, etc. Suppose A is now thinking of eating broccoli. It would then follow that B, C, etc are thinking of eating broccoli. Or suppose that B is now lifting his left arm. It would follow that C, D, etc are now lifting their left arm. Of course, nothing of the sort follows.

Rather than insisting that these people must be you because their existence would preserve what you most deeply care about, you should drop the assumption that someone must be identical to you in order for you to have towards him the sort of intimate concern that people typically have only for themselves. If you believe psychological continuity is the relation that matters, what you should say is that you care about the survival of whoever is psychologically connected to you, regardless of whether you would be the person or people so connected.

MikeDC writes:

Well, it's not a perfect analogy, of course. The brain and the information it contains don't die immediately upon clinical death, which is when you stop breathing and circulating blood. This generally, but not always, kills the brain.

Blackadder writes:

Prof. Hanson said: Instead, I could imagine changes big enough that I wasn't me anymore. In fact, I'm not at all sure I'm the same person I was when I was twenty.

If that's right, then doesn't the case for cryonics in terms of life extension collapse? After all, if you live another 20 years, then it won't matter whether the Robin Hanson of the future is uploaded one day, as this will be someone else.

Blackadder writes:

Eliezer Yudkowsky said: It so happens that we live in a universe where, by a stroke of good fortune, it is very easy to see that personal identity cannot possibly go along with a particular set of atoms

Does anyone actually believe that personal identity goes with a particular set of atoms? Sounds like a straw man.

Dan Weber writes:

Bryan posting about cryonics always brings out the crazies.

Although I suppose I would place value in having an android duplicate of me, even if he would not share my sense of self. He would have the same desires to take care of my family, and I place value on that.

I'm not sure that this value would persist for very long, though. Within a few generations I could easily imagine my android clone taking care of his android family to the detriment of my organic descendants. And while that would surely be an interesting and valuable life form, I myself don't value it much.

This ties into Bryan's other post about the unknowables of global warming. Just how certain is it that cryonics will work? Just how possible is it that it could go very, very wrong?

Ralph C. Merkle writes:

The 4th Quarter 2008 issue of Cryonics magazine (see http://www.alcor.org/cryonics/cryonics0804.pdf) describes a cryonics revival scenario using molecular nanotechnology (MNT).

This scenario does not involve uploading, it involves direct analysis and restoration of your existing cellular and molecular structures.

Cryonics is not about uploading. Cryonics is about preserving your life until future medical technology can revive you. The referenced article gives our best guess about what that future medical technology will look like. But whatever it looks like, it will be the mainstream medical technology of the day (and will no doubt have gone through extensive clinical testing and be subject to a wide range of medical regulations and laws).

If I were terminally ill I would be delighted to have access to the medical technology of 2100 -- and that is what cryonics offers. I certainly would not reject it based on some abstract speculations today about the nature of the mind.

Chris Hibbert writes:

I was at Bryan Caplan's house the day before yesterday (thanks Bryan!), and cryonics did come up, and I pointed out that Robin and I were the only ones there who were signed up, so I think I was a party to the conversation Bryan is speaking of. It's plausible that I said "those aren't interesting questions", so I'll try to expand.

I'd like to live a lot longer than a standard lifespan, and it's not obvious that healthy lifespan is going to grow fast enough to keep me going as long as I'd like, so I'm signed up with Alcor. I don't worry about whether I'll be revived using the same atoms that I currently use, or even the same ones I'm frozen with. I did pay extra for a "whole body" contract rather than a brain-only ("neuro") contract because I suspect that there's significant learning and memory in the interaction between brain and body. I like my current body and the way it works with my brain. If it's quicker and easier to train a revived and repaired brain with a familiar body, I'll be glad of the extra investment.

If uploading is easier than revival when the technology is developed, and there's reason to suspect that revival will take a lot longer, then I'll be happy to be uploaded rather than revived. There won't be much point to it if the people who are revived and/or uploaded before me don't think they are continuations of their previous incarnations with similar goals and tastes. I think the people with an interest in bringing others back (their friends, significant others, and descendants) won't consider the project to be a success if the results produce people unlike the people who were suspended, so I expect the chances of revival to be higher if it's producing people who are very much continuations of their previous instantiation.

I agree with Alan Crowe above. Mental phenomena emerge from physical phenomena; this doesn't mean that they're not explainable in terms of the fundamentals, just that there are simpler descriptions that live entirely in the mental realm.

It's the discussion about whether "I'm still me" that I find uninteresting, because it's circular and ungrounded. Until we get to the part of the discussion that some people have raised above, in which the terms are defined, and we actually talk about what we're trying to achieve, and what kinds of continuity are valuable, the conversation is vacuous. That's why I would have said the questions are uninteresting.

Ak Mike writes:

The cryogenic program for immortality is doomed if it depends on downloading your brain into a computer, for reasons discussed by Hubert Dreyfus in a series of papers and books written over the last forty-five years. The essence of his argument is that imagining that an intelligence can be human outside of a human body and social system is an absurd form of dualism. A mind is not a form of spirit, but an aspect of your bodily functions.

Jody writes:

I think continuity (or the illusion thereof, if you swing that way) is the critical hangup to the concept of self vis-à-vis replicas and downloads. I know it's mine.

In the case of a download, if there were some transitional period where I incorporated both hard- and wet-ware (perhaps taking a while, like in the GoBot mythology), then I think I would view the transition to all hard-ware as still being me, due to the continuity. The question then becomes: how long does the transition have to be for "it" to be me? Or... what is the quantum of self?

Steve Sailer writes:

I also like Woody's other punchline:

"I don't want to live forever in my art. I want to live forever in my apartment."

Michael Turner writes:

Eliezer writes: "And if you learn the basic ontology of standard quantum mechanics well enough ...."

Oh, don't quantum any of your mechanics at me! That stuff is also clearly and totally wrong, because it defies common sense! QED. Just ask Bryan.

Rudd-O writes:

Robin advanced a good argument as to how you could extend your life, and all you do is mock something he didn't even claim.

Clearly you didn't understand what Robin said.

[Comment edited for rudeness.--Econlib Ed.]

Nathan Cook writes:

Suppose I kill someone, then step into a transporter which gathers all the information about me needed to make an exact copy, copies me, and then destroys me. The copy steps out of the booth next door. Should he go to jail?

(inspired by this excellent short)

Sonic Charmer writes:

The real question is still this: whether or not a copy/replication philosophically 'counts' as you, why I should have strong feelings for such a replication and long for it/them to exist in the future is entirely unclear. This is simply not what most people mean when they wish for immortality or longer lives; identifying the latter with the former seems to rely on verbal sleight of hand:

"You say 'you' want immortality? Well I have a philosophical argument that says a certain type of computer program counts as 'you', so if I make that program, I've granted your wish!" No, you really haven't, not for most people.

In computerspeak, most people perceive themselves to be not merely the pattern/program, but this particular instantiation of the program. Robin et al. are then left insisting on philosophical grounds that they are 'wrong' to do so, which may be true on some level, but is irrelevant to people's desires - which are what they are.

Of course, perhaps Robin is simply different from most people, and genuinely feels that his immortality drive would be sated/satisfied by a replication. Good for him, although I think that's pretty strange (why does he care so much about setting up a future replication of himself? what for?), and it's simply not what most people mean by the concept of immortality or life extension.

Keith Henson writes:

At the practical level, when you need cryonics suspension, you have run out of other options.

Adrienne Barbeau-bot writes:

Stormy: okay, so what if i put my brain in a robot body, and then there was a war between robots and humans, which side would i be on?
Debbie: humans! you still have a human brain.
Sparks: but the humans would discriminate against you. You can't even vote!
Marco: man, we better not have to live on reservations, that would really chap my caboose!

On a more serious note, what if a mad scientist kidnaps you and makes an exact clone and then forgets which one is the clone? Which one is the real you and more importantly which one gets to sleep with your girlfriend??

Keith Henson writes:

" . . . more importantly which one gets to sleep with your girlfriend??"

Both of you. If they are really good copies how is she going to know?

The alternative is get the mad scientists to run off a copy of the girlfriend.

Then the only problem is who gets the bank account and investments?

Keith

Rafal Smigrodzki writes:

I wrote:

In the case of people like Robin or me, "personal identity" is simply our indexical information embedded in a material object.

Blackadder wrote:

If you upload your "indexical information" onto a machine then it will cease to be indexical to you, and thus wouldn't be you under your own definition.

Blackadder, think it through. If I upload my indexical information onto a machine, then the machine becomes a part of me. The information in the machine still forms an image that is unique to me, is present only in objects and devices that are a part of me (and what is me is self-referentially decided by routines encoded in the image itself). Read "I Am a Strange Loop" by Douglas Hofstadter for an exposition of self-reference in consciousness.

jw013 writes:

Cryonics is not about uploading. Cryonics is about preserving your life until future medical technology can revive you.

This future medical technology is also known as uploading. Who are you trying to fool with the idea that in the year 2100 your cracked brain will be defrosted, stitched together, and pumped full of oxygen-rich fluid? Human rights groups at the time would never permit such cruelty when all the data in the brain can be easily transferred out of its defective substrate.

If the person has explicitly declined uploading and wishes for biological recomposition instead, their request will be denied on the grounds of primal insanity, and they will nevertheless be uploaded. Once they are uploaded, I assure you they will be very quick to drop their objections to such a procedure.

Max More writes:

Although, unlike some commentators, I do think that arguments about personal identity/continuity are important in cryonics discussions, I agree with them that your arguments about Robin’s PI views do nothing to dismiss cryonics. Would you still reject cryonics if you thought it most likely that you would be revived in the very same (repaired) body that you inhabit immediately prior to legal death?

Perhaps my old pal Robin didn’t adequately explain the view of personal identity he was adopting. You say that “Fascination with technology crowds out not just philosophy of mind, but common sense.” In fact, cryonicists overall are remarkably sophisticated about both philosophy of mind and the philosophy of personal identity. My own PhD dissertation directly answers your worries: The Diachronic Self: Identity, Continuity, Transformation

I follow Derek Parfit for the most part in my account, especially on the point that what matters is not really personal identity but personal continuity. (If you split into two identical individuals, logically identity is lost, but you – what matters about being you – certainly don’t cease to exist.)

You cannot simply decide to identify yourself with just anything because not just anything will preserve the chain of psychological connections that is you. (As Zack M. Davis commented: “Disputes about the definition of what counts as really you get us nowhere; we should try to think about what actually happens, and then only afterward worry about what to call it.”) At the same time, a rebuilt brain – or an uploaded version of you – can indeed preserve those psychological connections and qualities.

If I'm whatever I define as me, why bother with cryonics? Why not "define myself" as my Y-chromosome, or my writings, or the human race, or carbon?

Because your writings, while an important part of your identity, are only a small part of it. We'd really like to have more of you around.

The human race, Y-chromosomes, and carbon are not unique to "you". What is unique to you is the pattern that your neurons make.

Are you saying that if someone blew your biological head off with a shotgun, you'd still be alive?!

Suppose that you used your copy of the movie "Aliens" as a clay pigeon for shotgun practice. Would this mean that the movie "Aliens" ceased to exist?

No, of course not. Your copy of "Aliens" was but a single instantiation of "Aliens". Millions of additional copies of "Aliens" exist on a variety of media. The essence of "Aliens" is a pattern of bits that represent a sequential pattern of light and sound. As long as that pattern exists in some form, on some media, the essence of "Aliens" would continue to exist.

Likewise, the Robin you know is but a single instantiation of the pattern made up by the neurons in his brain. As it happens, that pattern is currently instantiated on a biological substrate. That substrate is constantly changing and self-modifying, and difficult to duplicate. But that doesn't mean it couldn't be duplicated, in principle, or instantiated on another substrate.

Therefore, if Robin1 got his head shot off, the information it stored since Robin1 was uploaded would be lost, but most of the essence of Robin1 would continue to live on in Robin2.

There is no sharp boundary between "success" and "failure" when it comes to cryonics. Rather, there is a continuum of "success", with preserving your writing at one end, and restoring you so perfectly that not even your own mother would be able to detect any difference at the other.

In any case, cryonics doesn't require that you believe the above to be useful. Robin may have a more flexible definition of identity than you do, but that doesn't mean that you have to adopt it. You could demand that you only want to be revived if your current biological body can be repaired and restored to its current state.

Jessica writes:

If you replaced one neuron in your brain with a silicon-based neuron that is capable of all the same functions as the carbon-based one was, I imagine that you still think you are you. Imagine doing this slowly, until only silicon-based neurons are left. At what point do you magically become a different person?

Brian writes:

Mr. Caplan, it seems to me that saying your arguments are "common sense" side-steps the debate. Still, you've written a few points that I can look at.

We cannot simply choose to or try to identify with more durable things if our experiences suggest to us that our identity arises from thoughts and sensations which we have good reason to believe are processed in our brains. But of course you know this, so your suggestion seems disingenuous.

It's only a matter of semantics and therefore pointless to doubt whether an accurate simulation would really be you. It only seems to be an important question because in your mind the adjective "merely" fits nicely in front of the word "simulation". For me, an accurate simulation is a high and satisfying threshold by which to recognize success. To discuss this matter further would give semantics unwarranted importance, so this thought ends here.

If what I believe to be me is exactly what can be run in a simulation, then I'd pay for the chance to have it done. How's that for avoiding semantics?

Oh, and if you blow my head off with a shotgun, there'd be one less of me.
