shminux comments on Most Likely Cause of an Apocalypse on December 21 - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (22)
Why would it care to avoid inflicting pain? If it finds that extreme mental anguish and/or physical distress makes humans flail and screech in curious ways, it would have no reason not to repeat the observations over and over.
There are many FAI failure modes that don't involve gratuitous torture. I think 'could' is justified here, especially in comparison to Bioweapon catastrophe.
Right, there are plenty of failure modes, some less unpleasant than others, and some probably horrific beyond our worst nightmares. I suspect that any particular set of scenarios we find comforting would have measure zero in the space of possible outcomes. If so, preferring death by AI over death by a bioweapon is but a failure of imagination.
It doesn't take much comfort to beat a bioweapon that actually succeeds in killing everyone.
Simply using our atoms to make paperclips, and being quick about it, wins.