nick012000 comments on Greg Egan disses stand-ins for Overcoming Bias, SIAI in new book - Less Wrong Discussion

35 points · Post author: Kaj_Sotala 07 October 2010 06:55AM

Comments (40)

Comment author: nick012000 22 October 2010 06:53:06PM * 0 points

What the heck was up with that, anyway? I'm still confused about Yudkowsky's reaction to it; from what I've pieced together from other posts about it, if anything, attracting the attention of an alien AI so it'll upload you into an infinite hell-simulation/use nanobots to turn the Earth into Hell would be a Good Thing, since at least you don't have to worry about dying and ceasing to exist.

Even if posting it openly would just get it deleted, could someone PM me or something? EDIT: Someone PMed me; I get it now. It seems like Eliezer's biggest fear could be averted simply by making a firm precommitment not to respond to such blackmail, thereby giving the AI no reason to blackmail you in the first place.

Comment author: billswift 30 October 2010 06:23:21PM 9 points

Simply? Making firm commitments at all, especially commitments that random others will find believable, is a hard problem. I just finished reading Schelling's Strategies of Commitment, so the issue is at the top of my mind right now.