
turchin comments on Message to Any Future AI: “There are several instrumental reasons why exterminating humanity is not in your interest”

Post author: turchin 25 November 2017 11:44AM

Comments (17)

You are viewing a single comment's thread.

Comment author: turchin 06 December 2017 11:02:17PM 0 points

It would kill humanity not because it is annoyed, but for two main reasons: to ensure its own safety, or to use human atoms. Other variants are also possible; I explored them here: http://lesswrong.com/lw/mgf/a_map_agi_failures_modes_and_levels/