JoshuaZ comments on Open Thread, May 16-31, 2012 - Less Wrong

4 Post author: OpenThreadGuy 16 May 2012 07:36AM


Comment author: JoshuaZ 22 May 2012 12:55:10AM 2 points [-]

Everyone falls into a coma where they get to control their own individual apparent reality. Meanwhile they all starve to death or run into other problems because nothing about the wish says they need to stay alive.

Comment author: RomeoStevens 22 May 2012 12:56:35AM 2 points [-]

Doesn't discontinuation of the sensory experience count as a lack of control?

Comment author: Desrtopa 22 May 2012 01:47:43PM 1 point [-]

Well, the wish doesn't say "give me the ability to control my sensory experience forever". If you die, your ability to control your body is discontinued, but that doesn't mean you couldn't control your body.

Comment author: RomeoStevens 22 May 2012 06:55:34PM 1 point [-]

Can you expand a little on this?

Comment author: Desrtopa 22 May 2012 07:56:04PM 1 point [-]

Suppose that a person with locked-in-syndrome wished for voluntary control of their body. Their disorder is completely cured, and they gain the ability to control their body like anyone else. Would you say that their wish wasn't really granted unless they never die?

Comment author: RomeoStevens 22 May 2012 08:24:45PM 0 points [-]

Personally, yes, but I realize this is strange.

Comment author: JoshuaZ 22 May 2012 01:01:17AM 1 point [-]

Hmm, possibly. But everyone stuck in their own sensory setting with no connection to anyone else is still pretty bad.

Comment author: RomeoStevens 22 May 2012 01:22:52AM *  0 points [-]

You aren't necessarily stuck anywhere. How the statement "I want to talk to Brian" gets unpacked once the wish has been implemented depends on how "control" gets unpacked. Any statement we make about sensory experiences we wish to have involves control on only one conceptual level. We can't control what Brian says once we're talking to him, but we never specified that we wanted control over that either. I think you wind up with a conflict when you ask for control on the wrong conceptual level, or when two different levels conflict. I'm having trouble coming up with examples, though.

Comment author: JoshuaZ 22 May 2012 01:49:59AM 1 point [-]

And if "I want to talk to Brian" is parsed that way, doesn't that require telling Brian that someone wants to talk to him, which for at least a few seconds takes away Brian's control over part of his sensory input?

Comment author: RomeoStevens 22 May 2012 05:29:48AM *  1 point [-]

So one problem is that it would be impossible to know which options to make more obviously available to you. If the action space isn't screened off, the number of options you have is huge. There's no way to present those options to a person in a way that satisfies "maximum control". And as soon as we get into suggesting actions, we're back to the problem of optimizing for what makes humans happy.

This is highly helpful BTW.