
TheOtherDave comments on The impact of whole brain emulation - Less Wrong Discussion

3 Post author: jkaufman 14 May 2013 07:59PM




Comment author: TheOtherDave 21 May 2013 10:34:06PM 0 points

I can't imagine any mind would intentionally do that to themselves just to explore more sections of mind-space.

Mm. That's interesting. While I can't imagine actually arranging for my child to die in order to explore that experience, I can easily imagine going through that experience (e.g., with some kind of simulated person) if I were living in a post-scarcity kind of environment and thought I had a reasonable chance of learning something worthwhile in the process.

I can similarly easily imagine myself temporarily adopting various forms of theism, atheism, former-theism, and all kinds of other mental states.

And I can even more easily imagine encouraging clones of myself to do so, or choosing to do so when there's a community of clones of myself already exploring other available paths. Why choose a path that's already being explored by someone else?

It sounds like we're both engaging in mind projection here... you can't imagine a mind being willing to choose these sorts of many-sigmas-out experiences, so you assume a population of clone-minds would stick pretty close to a norm; I can easily imagine a mind choosing them, so I assume a population of clone-minds would cover most of the available space.

And it may well be that you're more correct about what clones of an arbitrarily chosen mind would be like... that is, I may just be an aberrant data point.

Comment author: Yosarian2 21 May 2013 11:01:04PM 0 points

I can easily imagine a mind choosing them, so I assume a population of clone-minds would cover most of the available space.

Ok, so let's say for the sake of argument that you're more flexible about such things than 90% of the population is. If so, would you be willing to modify yourself into someone less flexible, into someone who never would want to change himself? If you don't, then you've just locked yourself out of about 90% of all possible mindspace on that one issue alone. However, if you do, then you're probably stuck in that state for good; the new you probably wouldn't want to change back.

Comment author: TheOtherDave 21 May 2013 11:25:27PM 0 points

Absolutely... temporarily being far more rigid-minded than I am would be fascinating. And knowing that the clock was ticking and that I was going to return to my ordinary way of being would likely be deliciously terrifying, like a serious version of a roller coaster.

But, sure, if we posit that the technology is limited such that temporary changes of this sort aren't possible, then I wouldn't do that if I were the only one of me... though if there were a million of me around, I might.