
Eitan_Zohar comments on I need a protocol for dangerous or disconcerting ideas. - Less Wrong Discussion

Post author: Eitan_Zohar 12 July 2015 01:58AM (3 points)




Comment author: Eitan_Zohar 13 July 2015 02:04:12AM * -1 points

For instance, with Dust Theory, you say that you gave it at most a 10% chance of being true, and it was paralyzing to you. This shouldn't be. First, you need to consider your priors and the evidence.

No, I rated the death outcome as having a 10% chance of occurring. But now I rate it much lower.

How often in the past have you had the actual experience that Dust Theory suggests is possible and which you fear? What actual experiential evidence do you have to suggest that Dust Theory is true?

This:

There will be some kind of natural selection among dust world-lines, which will result in more stable ones, and most likely I am already in such a line. In this line, dreaming is built such that it will not result in important shifts of reality. And this is borne out: dreaming is not an unconscious state. I start to have dreams immediately when I fall asleep. So dreaming is built so as not to interrupt some level of consciousness.

Basically, the fact that reality shifts only a little bit accounts for our observations in ways that other cosmological theories can't.

For that matter, one of the common threads of your fears seems to be that "you" cease to exist and are replaced by a different "you" or that "you" die. But the truth is already the case that people are constantly changing. The "you" from 10 years ago will be made up of different atoms than the "you" 10 years from now by virtue of the fact that our cells are constantly dying and being replaced. The thoughts we have also change from moment to moment, and our brains adjust the strengths of the connections between neurons in order to learn, such that our past and future brains gradually diverge.

Er, you don't understand the problem. I was worried about my subjective self dying.

Comment author: Dentin 15 July 2015 04:03:40AM 2 points

I suspect part of the issue here is that your concept of subjective self isn't constructed to be compatible with these kinds of thought experiments, or with the idea that reality may be forking and terminating all the time. I can say that because mine -is- compatible with such things, and as a result problems of this category mostly don't even show up on my radar.

Assuming I had a magical copying device that could copy my body at a sufficient accuracy, I could:

  • use the copier to create a copy of myself, and as the copy do the household chores, then self-destruct to free up resources, without worrying about my 'self' dying.

  • use the copier to create a copy of myself, then as the original do the chores and self-destruct, without worrying about my 'self' dying.

  • if there were a resource conflict that required the destruction of a copy, decide that I was the 'least important' copy and self-terminate, without worrying about my 'self' dying.

When a person's sense of identity can accommodate the above scenarios, concerns about your dust scenario don't even register as relevant: it doesn't matter which timeline or state you end up in; so long as your self is active somewhere, you're good.

How would you treat the above situations?

Comment author: Eitan_Zohar 15 July 2015 04:36:17AM * 0 points

use the copier to create a copy of myself, and as the copy do the household chores, then self-destruct to free up resources, without worrying about my 'self' dying.

I wouldn't do it in the first place, since there's a fifty percent chance of my winding up doomed. But if the copy has already been created, then no, it would not be me dying.

use the copier to create a copy of myself, then as the original do the chores and self-destruct, without worrying about my 'self' dying.

That is absolutely dying.

if there were a resource conflict that required the destruction of a copy, decide that I was the 'least important' copy and self-terminate, without worrying about my 'self' dying.

The same goes for this one.

Comment author: Dentin 15 July 2015 03:28:27PM 1 point

That's what I figured. If anything, I'd say this is your core issue, not dust theory: your sense of subjective self just doesn't map well onto what it's actually possible to do, so of course you're going to get garbage results from time to time.

Comment author: Darklight 13 July 2015 05:26:50PM 1 point

I guess I don't understand, then. Care to explain what your "subjective self" actually is?