ChristianKl comments on "Stupid" questions thread - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Because the AI is better at estimating the consequences of following an order than the person giving the order.
There is also the issue that the AI is likely to act in ways that change the order the person gives, if its own utility criteria are about fulfilling orders.
Also, even assuming a "right" way of making obedient FAI is found (for example, one that warns you if you're asking for something that might bite you in the ass later), there remains the problem of who is allowed to give orders to the AI. Power corrupts, etc.