Risto_Saarelma comments on Your transhuman copy is of questionable value to your meat self. - Less Wrong Discussion

12 Post author: Usul 06 January 2016 09:03AM

Comment author: Risto_Saarelma 08 January 2016 06:02:22AM 1 point

There is some Buddhist connection, yes. The moments-of-experience idea is a thing in some meditation styles, and advanced meditators actually describe subjective experience starting to feel like an on/off sequence rather than a continuous flux. I haven't gone very deep into what either the Buddhist metaphysics or the meditation phenomenology says. Neuroscience also has some work on discrete steps of consciousness, but I likewise haven't gone very deep into that. Anyway,

I'm with them so far. Here's where I get off: all sentient beings are points of naked awareness; by definition they are identical (naked, passive), therefore they are the same; therefore even this self does not matter; therefore thou shalt not value the self more than others. At all. On any level. All of which can lead to bricking yourself up in a cave being the correct course of action.

This is still up for grabs. Given the whole thing about memories being what makes you you, consciousness itself is nice, but it's not all that. It can still be your tribe against the world, your family against your tribe, your siblings against your family, and you and your army of upload copies against your siblings and their armies of upload copies. So I'm basically thinking about this in terms of kin altruism and a general principle of keeping people more like you closer in your circle of concern than people less like you. Upload copies are far, far closer kin than any actual kin.

So am I a pattern theorist? Not quite sure. It seems to resolve lots of paradoxes in the upload thought experiments, and I have no idea how to prove it wrong. (I would like to find a way, though; it seems somewhat simplistic, and we definitely still don't understand consciousness to my satisfaction.) But like I said, if I sit down on an upload couch, I fully expect to get up from an upload couch, not to suddenly be staring at a HUD saying "IN SIMULATION", even though pattern theory seems to say I should expect each outcome with 50% probability. No matter which interpretation is right, there will be someone who wakes up in the simulation with my memories, so I imagine those versions will start expecting to shift viewpoints as they run further upload scans, while the version of me who always wakes up on the upload couch (by the coin-toss tournament logic, there will be a me who never experiences waking up in a simulation, and one who always does) will continue not to expect much. I think uploads are a good idea more for the kin-selection-like reasons above than because I'm convinced they're a ticket to personal immortality.
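A minimal sketch of that coin-toss tournament logic (the names and framing here are my own illustration, not anything from the thought experiment itself), assuming each scan splits every existing viewpoint history into a couch branch and a simulation branch:

```python
# Illustrative sketch: each upload scan splits every existing viewpoint
# history into one that gets up from the couch ("couch") and one that
# wakes in the simulation ("sim"). After n scans there are 2**n possible
# histories, and exactly one of them never experiences the simulation.
from itertools import product

def viewpoint_histories(n_scans):
    """All 2**n_scans possible viewpoint histories after n_scans scans."""
    return list(product(("couch", "sim"), repeat=n_scans))

histories = viewpoint_histories(3)
never_sim = [h for h in histories if "sim" not in h]

print(len(histories))  # 8 branches after three scans
print(len(never_sim))  # exactly 1 branch never wakes in the simulation
```

So even though every individual branching looks like a 50/50 coin toss, the tournament always contains one history that never wakes in the simulation and one that always does, which is why that one version of me can keep not expecting much.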

I wouldn't give a damn about aliens taking my body and brain apart every time I sleep, as long as they put it back together perfectly again, though; so if that makes me a pattern theorist, then yes.