
nick012000 comments on Greg Egan disses stand-ins for Overcoming Bias, SIAI in new book - Less Wrong Discussion

35 Post author: Kaj_Sotala 07 October 2010 06:55AM




Comment author: XiXiDu 07 October 2010 10:22:36AM 15 points

By the way, a quote from Greg Egan is one of the highest voted comments on LW:

You know what they say the modern version of Pascal's Wager is? Sucking up to as many Transhumanists as possible, just in case one of them turns into God. -- Julie from Crystal Nights by Greg Egan

I wonder if Greg Egan actually knows how seriously the people here take this stuff. You could write a non-fiction book about the incidents and beliefs within this community, tell people it is science fiction, and they would review it as exaggerated fantasy. I guess that means the Singularity has already happened, at least when it comes to the boundaries between what is factual, realistic, fiction, science fiction, fantasy, not even wrong, or plain bullshit.

Comment author: nick012000 22 October 2010 06:53:06PM 0 points

What the heck was up with that, anyway? I'm still confused about Yudkowsky's reaction to it; from what I've pieced together from other posts about it, if anything, attracting the attention of an alien AI so it'll upload you into an infinite hell-simulation/use nanobots to turn the Earth into Hell would be a Good Thing, since at least you don't have to worry about dying and ceasing to exist.

Even if posting it openly would just get deleted, could someone PM me or something? EDIT: Someone PMed me; I get it now. It seems like Eliezer's biggest fear could be averted simply by making a firm precommitment not to respond to such blackmail, thereby giving the AI no reason to blackmail you in the first place.

Comment author: billswift 30 October 2010 06:23:21PM 9 points

Simply? Making firm commitments at all, especially commitments believable by random others, is a hard problem. I just finished reading Schelling's Strategies of Commitment so the issue is at the top of my mind right now.