Eliezer_Yudkowsky comments on Torture vs Dust Specks Yet Again - Less Wrong
Violating a coherence theorem always carries with it an appropriate penalty of incoherence. What is your reply to the obvious argument from circular preference?
It would be utterly disastrous to create an AI which would allow someone to be slapped in the face to avoid a 1/(3↑↑↑3) probability of destroying the Earth.
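As a sanity check on the arithmetic, here is a minimal sketch, not anything from the comment itself: the disutility figures for a slap and for Earth's destruction are made-up placeholders, and since 3↑↑↑3 cannot be written down, the code substitutes a vastly smaller lower bound, which only strengthens the conclusion.

```python
from fractions import Fraction

# Assumed, illustrative disutilities -- not from the original comment.
harm_slap = Fraction(1)            # one slap = 1 unit of disutility
harm_earth = Fraction(10) ** 30    # destroying the Earth = 10^30 units

# 3↑↑↑3 is unrepresentable, but any lower bound suffices here:
# 3↑↑↑3 = 3↑↑(3↑↑3) > 10^(10^6), a million-digit number Python can hold.
lower_bound = 10 ** (10 ** 6)

# The probability of catastrophe is 1/(3↑↑↑3) < 1/lower_bound, so this
# overestimates the expected harm the slap would buy off:
expected_harm_avoided = harm_earth / lower_bound

print(expected_harm_avoided < harm_slap)   # True: the slap is the worse deal
```

Even with the bound understated by an unimaginable margin, the expected harm avoided comes out far below the cost of the slap.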
Suppose that I would tentatively choose to torture one person to save a googolplex people from dust specks, and that I would additionally choose torture to save only a googol people from papercuts. Do I have circular preferences if I would be much, much more willing to save a googolplex people from dust specks by giving papercuts to a googol people than to save either group, from specks or papercuts, by torturing one person?
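For what it's worth, the exchange rates these two trades imply chain together consistently rather than circularly, at least under the assumption (mine, not the comment's) that harms aggregate additively. A minimal sketch in log10 space, since a googolplex has no direct representation:

```python
# Work in log10 space; googolplex = 10^(10^100), but its base-10
# exponent is an exact Python integer.
log10_googol = 100
log10_googolplex = 10 ** 100

# Reading the two stated trades as indifference points under additive harm:
#   1 torture ~ googolplex dust specks
#   1 torture ~ googol papercuts
# Chaining them gives the implied speck/papercut exchange rate:
log10_specks_per_papercut = log10_googolplex - log10_googol

# One papercut ~ 10^(10^100 - 100) dust specks.  Trading specks for
# papercuts at this rate, then papercuts for torture, reproduces the
# original speck/torture rate -- the rates are transitive, not circular.
print(log10_specks_per_papercut)   # 10**100 - 100
```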
I can achieve the exact same total utility by giving specks to a googolplex people, giving papercuts to a googol people, or torturing one person. If I had to save 3^^^3 people from dust specks, I'd give papercuts to 3^^^3*googol/googolplex people instead of torturing anyone, and I'd much prefer to save the 3^^^3 people from dust specks by subjecting perhaps 2^^^2 people to a relatively troublesome dust speck. So why exactly do I prefer troublesome dust specks over papercuts over torture even though total utility comes out the same either way?

I think I'm probably doing utilitarianism as more of a maximin calculation: maximizing, in some way, the minimum individual utility. I can't simply maximize total utility in cases where additional utility for some people must be bought at the cost of negative utility for others; increasing total utility that way requires something more like a fair exchange between the individuals involved.
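Here is a minimal sketch of that maximin reading, again in log10 space. The per-person harm magnitudes are assumptions chosen so that the three options tie exactly on total harm:

```python
# (log10 of number of people harmed, log10 of per-person harm), chosen so
# all three options have identical total harm.  Values are illustrative.
options = {
    "dust specks": (10 ** 100, 0),            # googolplex people, 1 unit each
    "papercuts":   (100, 10 ** 100 - 100),    # googol people
    "torture":     (0, 10 ** 100),            # one person
}

for name, (log10_people, log10_harm) in options.items():
    log10_total = log10_people + log10_harm   # total harm = people * harm
    print(f"{name}: log10(total harm) = {log10_total}, "
          f"log10(worst-off harm) = {log10_harm}")

# Every option's total harm is 10^(10^100): a total-utility view is
# indifferent between them.  Maximin instead minimizes the worst individual
# harm, which ranks the options specks < papercuts < torture -- matching
# the stated preference ordering.
```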