Unknowns comments on Open thread, Nov. 24 - Nov. 30, 2014 - Less Wrong Discussion

4 Post author: MrMind 24 November 2014 08:56AM

Comment author: Unknowns 30 November 2014 03:50:54PM 0 points [-]

If there is a future Great Filter, it seems likely it would be one of two things:

1) a science experiment that destroys the world even though there was no reason to think that it would.

2) something analogous to nuclear weapons, except easily constructible by an individual from easily obtainable materials, so that as soon as the knowledge spreads, any random person can inflict immense destruction.

Are there any strategies that would guard against these possibilities?

Comment author: Izeinwinter 30 November 2014 07:33:43PM -1 points [-]

1: No. Well, in theory, a presence on the moons of Neptune that could survive indefinitely without contact would do it, but that's not going to happen any time soon.

2: Arguably, we already live in this world. There are very destructive things in the canon of human knowledge, only people don't conceptualize them as weapons at all, but merely as dangers to be avoided.* So... good news, this does not work as a filter, and the actually odd thing is that we *do* think of runaway supercriticality as a weapon. Conditioning by lots of wars to think of explosions as ways to kill people?

*I'm not going to name examples in this context, because that might theoretically "help" someone to think of such an example as a weapon. Which would be bad.