BerryPick6 comments on If it were morally correct to kill everyone on earth, would you do it? - Less Wrong

-6 Post author: Bundle_Gerbe 30 January 2013 11:58PM




Comment author: BerryPick6 01 February 2013 12:10:53AM 0 points

"social consequences aside, is it morally correct to kill one person to create a million people who would not have otherwise existed?"

How would a world in which it is morally correct to kill one person in order to create a million people look different than a world in which this is not the case?

Comment author: MugaSofer 01 February 2013 02:56:28PM -2 points

Friendly AIs would behave differently, for one thing.

Comment author: BerryPick6 01 February 2013 03:03:35PM 0 points

You may have to be a bit more specific. What in the FAI's code would look different between world 1 and world 2?

Comment author: MugaSofer 19 February 2013 02:03:08PM -2 points

Define "moral" as referring to human ethics, whatever those may be. Define "Friendly" as meaning "does the best possible thing according to human ethics, whatever those may be." Define "AI" as a superintelligence. Any Friendly AI, by these definitions, would behave differently depending on whether X is "moral".

Does that answer your question?