Fronken comments on AI box: AI has one shot at avoiding destruction - what might it say? - Less Wrong

18 Post author: ancientcampus 22 January 2013 08:22PM


Comments (354)


Comment author: Fronken 25 January 2013 09:14:14PM *  2 points

I don't value the lives of simulated copies at all

You should >:-( poor copies getting tortured because of you, you monster :(

Comment author: handoflixue 25 January 2013 09:46:58PM 0 points

Because of me?! The AI is responsible!

But if you'd really prefer me to wipe out humanity so that we can have trillions of simulations kept in simulated happiness, then I think we have an irreconcilable preference difference :)

Comment author: JohnWittle 30 January 2013 12:32:48AM 3 points

You wouldn't be wiping out humanity; there would be trillions of humans left.

Who cares if they run on neurons or transistors?

Comment author: handoflixue 30 January 2013 10:03:39PM 1 point

Who cares if they run on neurons or transistors?

Me!