Grognor comments on Cryonics without freezers: resurrection possibilities in a Big World - Less Wrong
Isn't dissolving the concept of personal identity relatively straightforward?
We know that we've evolved to protect ourselves and further our own interests, because organisms who didn't have that as a goal didn't fare very well. So in this case at least, personal identity is merely a desire to make sure that "this" organism survives.
Naturally, the problem is defining "this organism". One says that "this" organism is defined by physical continuity. Another says that "this" organism is defined by its degree of similarity to some prototype of the organism.
One says, sound is acoustic vibrations. Another says, sound is the sensation of hearing...
There's no "real" answer to the question "what is personal identity", any more than there is a "real" answer to the question "what is sound". You may pick any definition you prefer. Of course, truly dissolving "personal identity" isn't as easy as dissolving "sound", because we are essentially hard-wired to anticipate that there is such a thing as personal identity, and to have urges for protecting it. We may realize on an intellectual level that "personal identity" is just a choice of words, but still feel that there should be something more to it, some "true" fact of the matter.
But there isn't. There are just various information-processing systems with different degrees of similarity to each other. One may draw more-or-less arbitrary borders between the systems, designating some as "me" and some as "not-me", but that's a distinction in the map, not in the territory.
Of course, if you have goals about the world, then it makes sense to care about the information-processing systems that are similar to you and share those goals. So if I want to improve the world, it makes sense for me to care about "my own" (in the commonsense meaning of the word) well-being - even though future instances of "me" are actually distinct systems from the information-processing system that is typing these words, I should still care about their well-being because A) I care about the well-being of minds in general, and B) they share at least part of my goals, and are thus more likely to carry them out. But that doesn't mean that I should necessarily consider them "me", or that the word would have any particular meaning.
And naturally, on some anticipation/urge-level I still consider those entities "me", and have strong emotions regarding their well-being and survival, emotions that go above and beyond what is justified merely in light of my goals. But I don't consider that something I should necessarily endorse, except to the extent that such anticipations are useful instrumentally. (E.g. status-seeking thoughts and fantasies may make "me" achieve things which I would not otherwise achieve, even though they make assumptions about such a thing as personal identity.)
Nay, I don't think it is.
I don't take issue with anything in particular you said in this comment, but it doesn't feel like a classic, non-greedy Reduction in the style used to reduce free will into cognitive algorithms or causality into math.
The confused concept that I do not think has been dissolved is the sense in which you can create another entity arbitrarily like yourself, say "I identify with this creature based on so-and-so definition", and still have different experiences than the golem no matter how like you it is; I am not sure a non-fake dissolving of it has ever even been started. (Example: Susan Blackmore's recent "She Won't Be Me". This is clearly a fake reduction; you don't get to escape the difficulties of personal identity confusion by saying a new self pops up every few minutes/seconds/plancktimes. Your comment is less obviously wrong, but it still sidesteps the confusion instead of Solving it.)
Hell, it's not just a Confusing Problem. I'd say it's a good candidate for The Most Confusing Problem.
Edit (one of many little ones): I made this comment pretty poorly, but I hope the point both makes sense and got through relatively intact. Mitchell Porter's comment is also really good until the penultimate paragraph.
I tried responding to this example, but I find it so foreign and confused that I don't even know how to make enough sense of it to offer a critique or an explanation. Why wouldn't you expect there to exist an entity with different experiences than the golem, and which remembers having identified with the golem? You're not killing it, after all.