Eugine_Nier comments on Students asked to defend AGI danger update in favor of AGI riskiness - Less Wrong Discussion

Post author: lukeprog 18 October 2011 05:24AM


Comment author: Eugine_Nier 19 October 2011 06:10:06AM 4 points

This post seems to serve no purpose except to promote the dark arts.

Comment author: PhilGoetz 21 October 2011 04:41:00PM 2 points

A quote from the PDF:

Also, wherever possible I tried to choose material that was freaky. The Big Dog video is a particularly good example, as lots of people seem to find Big Dog freaky. Of course, at no point did I comment on the freakiness. I did not want my students to think that I wanted to unsettle them. I simply wanted them to experience their own natural reactions as they witnessed the power of artificial intelligence unfolding in front of them.

I then played two clips from the movie Terminator 2: Judgment Day. The first clip, from the very beginning of the movie, showed the future war between humans and robots. The second clip showed John Connor, Sarah Connor and the Terminator discussing the future of humanity and how the artificial intelligence Skynet was built. I chose the first clip in order to vividly present the image of an AGI catastrophe. I chose the second clip in order to present the following pieces of dialogue. ... These were the most ominous and portentous bits of dialogue I could find.

So, yes, dark arts. But the way he kept asking "And how would the AI do that?" was excellent.