fubarobfusco comments on What happens when your beliefs fully propagate - Less Wrong Discussion

20 Post author: Alexei 14 February 2012 07:53AM

Comment author: fubarobfusco 15 February 2012 05:51:32AM 3 points

I think when LWers say "raise the sanity waterline," two ideas are being presented. One is to make everyone a little bit more sane. That's nice, but overall probably not very beneficial to the FAI cause. Another is to make certain key people a bit more sane — hopefully sane enough to realize that FAI is a big deal, and sane enough to make some meaningful progress on it.

There's another possible scenario: The AI Singularity isn't far, but it is not very near, either. AGI is a generation or more beyond our current understanding of minds, and FAI is a generation or more beyond our current understanding of values. We're making progress; and current efforts are on the critical path to success — but that success may not come during our lifetimes.

Since this is a possible scenario, it's worth having insurance against it. And that means making sure that the next generation is competent to carry on the effort, and survives to do so.

Cultivating a culture of rationality, awareness of existential risks, etc. is surely valuable for that purpose, too.

Comment author: Alexei 15 February 2012 04:55:05PM 0 points

Good point, thanks.