MugaSofer comments on Failed Utopia #4-2 - Less Wrong

52 Post author: Eliezer_Yudkowsky 21 January 2009 11:04AM


Comment author: MugaSofer 30 May 2013 10:38:19AM -1 points

I wish that the future will turn out in such a way that I would not regret making this wish.

Fixed that for you.

Comment author: fractalman 01 June 2013 07:49:52PM 0 points

Haven't you ever played the corrupt-a-wish game?

Wish granted: horror, as the genie/AI runs a matrix with copy after copy of you, brute-forcing the space of possible wishes, most of which turn out to be absolute disasters. But you aren't allowed to know that happens, because the AI goes... "insane" is the best word I can think of, though it's not quite correct... trying to grant what is nearly an ungrantable wish, freezing the population into stasis until it runs out of negentropy and crashes...

Now that's not to say friendly AI can't be done, but it WON'T be EASY.
If your wish isn't human-proof, it probably isn't AI-safe.

Comment author: MugaSofer 04 June 2013 07:05:25PM 0 points

Yes, I have. Saying "the genie goes insane because it's not smart enough to grant your wish" is not how you play corrupt-a-wish. You're supposed to grant the wish, but with a twist so it's actually a bad thing.

Comment author: fractalman 15 June 2013 04:42:41AM 0 points

Perhaps I didn't make the whole "it goes and pauses the entire world while trying to grant your wish" part clear enough...

Comment author: MugaSofer 15 June 2013 09:01:34PM 0 points

Trying and failing to grant the wish is not the same as granting it in a way that turns out to be terrible.

Comment author: FourFire 08 September 2013 10:58:42AM 0 points

If the AI can't figure out the (future) wishes of a single human being, then it is insufficiently intelligent, and thus not the AI you would want in the first place.

Comment author: CCC 04 June 2013 07:17:51PM 0 points

The genie vanishes, taking with it any memory that you ever met a genie. Since you would not remember making the wish, and since you would see no evidence of a wish having been made, you would not regret having made the wish.