
Konkvistador comments on Open thread, 14-20 July 2014 - Less Wrong Discussion

3 Post author: David_Gerard 14 July 2014 11:16AM


Comment author: [deleted] 18 July 2014 04:55:49PM 3 points [-]

Idea: The Great Filter as a self-imposed measure by sentient life to mitigate inevitable early thought experiment blunders in their histories.

Comment author: DanielLC 19 July 2014 06:52:12PM 4 points [-]

I don't understand. Can you explain that more?

Comment author: [deleted] 04 August 2014 09:04:49AM *  2 points [-]

There exist certain ideas that are very dangerous to think: they make you vulnerable to harm at the hands of future super-intelligences. Such ideas aren't hard to come by; most civilizations stumble upon them. One of the ways to render them inert is to end your existence.

Comment author: DanielLC 04 August 2014 08:40:50PM 1 point [-]

You mean like the basilisk?

Getting rid of your species seems like going overboard. If you saved the children and had them raised by robots, you'd be able to remove whatever dangerous memes you had created.

Also, if that is a common reaction to the basilisk, then a fundamental assumption behind it is the opposite of true. If your response is to ignore it, or to do even less, you have nothing to worry about.

Comment author: [deleted] 05 August 2014 08:57:13AM 1 point [-]

You mean like the basilisk?

Like a basilisk yes.

Comment author: ZankerH 22 July 2014 11:08:22PM 1 point [-]

Idea: The concept of a great filter is a collective failure of imagination on the part of humanity, amplified by a severe lack of data.

Comment author: shminux 23 July 2014 01:12:33AM 0 points [-]

Yeah, I (and others) have been saying this here and elsewhere.