Jiro comments on The Hidden Complexity of Wishes - Less Wrong

58 Post author: Eliezer_Yudkowsky 24 November 2007 12:12AM



Comment author: RichardKennaway 03 February 2015 01:04:03PM 1 point [-]

Even when you make requests of other people, they may fulfil them in ways you would prefer they hadn't. The more powerful the genie is at divining your true intent, the more powerfully it can find ways of fulfilling your wishes that may not be what you want. It is not obvious that there is a favorable limit to this process.

Your answers to questions about your intent may depend on the order the questions are asked. Or they may depend on what knowledge you have, and if you study different things you may come up with different answers. Given a sufficiently powerful genie, there is no real entity that is "how I interpret the wish".

How is the genie supposed to know your answers to all possible questions of interpretation? Large parts of "your interpretation" may not exist until you are asked about some hypothetical circumstance. Even if you are able to answer every such question, how is the genie to know the answer without asking you? Only by having a model of you sufficiently exact that you are confident it will give the same answers you would, even to questions you have not thought of and would have a hard time answering. But that is wishing for the genie to do all the work of being you.

A lot of transhumanist dreams seem to reduce to this: a Friendly AGI will do for us all the work of being us.

Comment author: Jiro 03 February 2015 04:44:02PM *  1 point [-]

> Your answers to questions about your intent may depend on the order the questions are asked. Or they may depend on what knowledge you have, and if you study different things you may come up with different answers.

If I ask the genie for long life, and the genie is forced to decide between a 200 year lifespan with a 20% chance of a painful death and a 201 year lifespan with a 21% chance of a painful death, it is possible that the genie might not get my preferences exactly correct, or that my preferences between those two results may depend on how I am asked or how I am feeling at the time.

But if the genie messed up and picked the one that didn't really match my preferences, I would only be slightly displeased. I observe that these things go together: in cases where it would be genuinely hard or impossible for the genie to figure out what I prefer, the fact that the genie might not get my preferences correct only bothers me a little. In cases where extrapolating my preferences is much easier, the genie getting them wrong would matter to me a lot more (I would really not like a genie that grants my wish for long life by turning me into a fungal colony). So just because the genie can't know the answer to every question about my extrapolated preferences doesn't mean that the genie can't know them to a sufficient degree that I would consider the genie good to ask for wishes.