
Clarity comments on Open thread, Oct. 19 - Oct. 25, 2015 - Less Wrong Discussion

3 points | Post author: MrMind 19 October 2015 06:59AM


Comments (198)


Comment author: John_Maxwell_IV 19 October 2015 07:47:39AM 9 points
Comment author: Clarity 19 October 2015 08:19:14AM 3 points

Actually a very high-quality subreddit. I'm impressed.

Comment author: Soothsilver 19 October 2015 07:41:43PM 3 points

I never realized how many people there are who say "it's a good thing if AI obliterates humanity, it deserves to live more than we do".

Comment author: OrphanWilde 19 October 2015 08:21:10PM 3 points

On some level, the question really comes down to what kind of successors we want to create; they aren't going to be us, either way.

Comment author: ChristianKl 19 October 2015 10:53:20PM 5 points

> On some level, the question really comes down to what kind of successors we want to create; they aren't going to be us, either way.

That depends on whether you plan to die.

Comment author: OrphanWilde 19 October 2015 10:59:36PM 5 points

If I didn't, the person I'll have become ten thousand years from now isn't going to be me; to that person, I will be at most a distant memory from a time long past.

Comment author: Soothsilver 20 October 2015 01:13:23PM 2 points

It will still be more "me" than paperclips.

Comment author: OrphanWilde 20 October 2015 01:29:54PM 3 points

Than paperclips, yes. Than a paperclip optimizer?

Well... ten thousand years is a very, very long time.

Comment author: passive_fist 20 October 2015 02:52:27AM -1 points

It's a perfectly reasonable position when you consider that humanity is not going to survive long-term anyway. We're either going extinct and leaving nothing behind, evolving into something completely new and alien, or being destroyed by our intelligent creations. The first possibility is undesirable. The second and third are indistinguishable from the point of view of the present (assuming AI will be developed far enough in the future that no currently living humans will suffer pain or sudden death because of it).

Comment author: Soothsilver 20 October 2015 01:12:54PM 5 points

You might still want your children to live rather than die.