TheOtherDave comments on Timeless Identity - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Hrm... ambiguous semantics. I took it to imply acceptance of the idea but not elevation of its importance, though I see how it could be interpreted differently. And yes, the rest of the post addresses something completely different. But if I can continue for a moment on the tangent, expanding my comment above (even if it doesn't apply to the OP):
You actually continue functioning when you sleep; it's just that you don't remember the details once you wake up. A more useful example for this discussion is general anesthesia, which shuts down the regions of the brain associated with consciousness. If personal identity is in fact derived from continuity of computation, then it is plausible that general anesthesia would result in a "different you" waking up after the operation. The application to cryonics depends greatly on the subtle distinction of whether vitrification (and, more importantly, the recovery process) slows down or stops computation. This has been a source of philosophical angst for me personally, but I'm still a cryonics member.
More troubling is the application to uploading. I haven't done this yet, but I want my Alcor contract to explicitly forbid uploading as a restoration process, because I am unconvinced that a simulation of my destructively scanned frozen brain would really be a continuation of my personal identity. I was hoping that “Timeless Identity” would address this point, but sadly it punts the issue.
Like TheOtherDave (I presume), I consider my identity to be adequately described by any Turing machine that can emulate my brain, or at least its prefrontal cortex plus the relevant memory storage. I suspect that a faithful simulation of just my Brodmann area 10, coupled with a large chunk of my memories, would restore enough of my self-awareness to be considered "me". This sim-me would probably lose most of my emotions without the rest of the brain, but that is still infinitely better than nothing.
There's a very wide range of possible minds I consider to preserve my identity; I'm not sure the majority of those emulate my prefrontal cortex significantly more closely than they emulate yours, and the majority of my memories are not shared by the majority of those minds.
Interesting. I wonder what you would consider a mind that preserves your identity. For example, I assume that the total of your posts online, plus whatever other information is available without some hypothetical future brain scanner, all running as a process on some simulator, is probably not enough.
At one extreme, if I assume those posts are being used to create a me-simulation by me-simulation-creator that literally knows nothing else about humans, then I'm pretty confident that the result is nothing I would identify with. (I'm also pretty sure this scenario is internally inconsistent.)
At another extreme, if I assume the me-simulation-creator has access to a standard template for my general demographic and is just looking to customize that template sufficiently to pick out some subset of the volume of mindspace my sufficiently preserved identity defines... then maybe. I'd have to think a lot harder about what information is in my online posts and what information would plausibly be in such a template to even express a confidence interval about that.
That said, I'm certainly not comfortable treating the result of that process as preserving "me."
Then again I'm also not comfortable treating the result of living a thousand years as preserving "me."