Premise: A person who wants to destroy the world and has an exceedingly large amount of power would be just as big a problem as an AI with an exceedingly large amount of power.

Question: Do we simply think that an AI will be able to obtain that amount of power more effectively than a human could, and that the ratio of world-destroying AIs to safe AIs is larger than the ratio of world-destroying humans to ordinary humans?