Joshua_Fox comments on One Life Against the World - Less Wrong
What if, as we approach the Singularity, it is provably or near-provably necessary to do unethical things like killing a few people or letting them die to avoid the worst of Singularity outcomes?
(I am not referring here to whether we may create non-Friendly AGI. I am referring to scenarios even before the AGI "takes over.")
Such scenarios seem not impossible, and they create ethical dilemmas along the lines of what Yudkowsky mentions here.