
DanielLC comments on Open thread, 14-20 July 2014 - Less Wrong Discussion

3 Post author: David_Gerard 14 July 2014 11:16AM




Comment author: DanielLC 19 July 2014 06:52:12PM 4 points

I don't understand. Can you explain that more?

Comment author: [deleted] 04 August 2014 09:04:49AM 2 points

There exist certain ideas that are very dangerous to think: they make you vulnerable to harm at the hands of future superintelligences. Such ideas aren't hard to come by; most civilizations stumble upon them. One way to render them inert is to end your existence.

Comment author: DanielLC 04 August 2014 08:40:50PM 1 point

You mean like the basilisk?

Getting rid of your species seems like going overboard. If you saved the children and had robots raise them, you could remove whatever dangerous memes you had created.

Also, if that is a common reaction to the basilisk, then a fundamental assumption behind it is the opposite of true. If your response is to ignore it, or something even less cooperative, you have nothing to worry about.

Comment author: [deleted] 05 August 2014 08:57:13AM 1 point

You mean like the basilisk?

Like a basilisk, yes.