Will_Newsome comments on Rational Terrorism or Why shouldn't we burn down tobacco fields? - Less Wrong Discussion

-2 Post author: whpearson 02 October 2010 02:51PM

Comment author: Will_Newsome 03 October 2010 02:22:25AM *  0 points [-]

> I don't really know the SIAI people, but I have the impression that they're not against AI at all. Sure, an unfriendly AI would be awful - but a friendly one would be awesome.

True. I know the SIAI people pretty well (I'm kind of one of them) and can confirm they agree. But they're pretty heavily against uFAI development, which is what I thought XiXiDu's quote was talking about.

> And they probably think AI is inevitable, anyway.

Well... hopefully not, in a sense. SIAI's working to improve widespread knowledge of the need for Friendliness among AGI researchers. It's inevitable (barring a global catastrophe), but they're hoping to make FAI more inevitable than uFAI.

As someone who volunteered for SIAI at the Singularity Summit, a critique of SIAI could be to ask why we're letting people who aren't concerned about uFAI speak at our conferences and affiliate with our memes. I think there are good answers to that critique, but the critique itself is a pretty reasonable one. By comparison, most complaints about SIAI are maddeningly irrational (in my own estimation).

Comment author: Larks 03 October 2010 07:41:13PM 1 point [-]

A stronger criticism, I think, is to ask why the only mention of Friendliness at the Summit was a few very veiled hints in Eliezer's speech. Again, I think there are good reasons, but not reasons that many people know, so I don't understand why people bring up other criticisms before this one.