Joshua_Fox comments on One Life Against the World - Less Wrong

Post author: Eliezer_Yudkowsky 18 May 2007 10:06PM

Comment author: Joshua_Fox 20 May 2007 12:16:11PM 0 points

What if, as we approach the Singularity, it becomes provably or near-provably necessary to do unethical things, such as killing a few people or letting them die, in order to avoid the worst Singularity outcomes?

(I am not referring here to whether we may create non-Friendly AGI. I am referring to scenarios even before the AGI "takes over.")

Such scenarios do not seem impossible, and they create ethical dilemmas along the lines of what Yudkowsky mentions here.