AllanCrossman comments on Ethics as a black box function - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Well, you have to understand what such a decision would actually look like. For a decision to truly maximize total welfare across all people, even as it "stabs your friends in the back", the welfare gain would have to be large enough to at least cancel out the resulting degradation of the value of friendship.
That is, if I expect my friendship with someone not to mean that they weight me more heavily than a random person in their utility function, friendship becomes less valuable, and an entire class of socially beneficial activity enabled by friendship (e.g. the lower cost of monitoring for cheating) contracts.
I think your hypothetical here has the same problem as presenting the true Prisoner's Dilemma: in the true PD, it's hard to intuitively imagine a circumstance where the utilities in the payoff matrix already account for my compassion for my accomplice. Just the same, in the tradeoff you presented, it's hard to intuitively see what kind of social gain could outweigh a general degradation of friendship.
ETA: Okay, it's not that hard, but like with the true PD, such situations are rare: for example, if I were presented with the choice of "My twenty closest friends/loved ones die" vs. "All of humanity except me and my twenty closest die". But even then, if e.g. my friends have children not in the set of 20, it's still not clear that all of the twenty would prefer the second option!
I'm not sure I'm understanding properly. You talk as if my action would drastically affect society's views of friendship. I doubt this is true for any action I could take.
Well, all my point really requires is that the decision moves society in that direction. The fraction of "total elimination of friendship" that my decision causes must be weighed against the supposed net social gain (other people's gain minus my friends' loss), and it's not at all obvious which is greater.
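To make that weighing concrete, here is a minimal sketch. All the numbers are hypothetical, chosen only to illustrate the structure of the comparison, not taken from the discussion above: a betrayal maximizes total welfare only if the net gain to strangers (minus the loss to my friends) exceeds my decision's marginal contribution to the society-wide devaluation of friendship.

```python
# Hypothetical utilities, purely illustrative.
gain_to_strangers = 100.0   # welfare gained by other people
loss_to_friends = 60.0      # direct welfare lost by my friends
net_social_gain = gain_to_strangers - loss_to_friends

# Marginal contribution of my betrayal to the erosion of friendship:
# (value of the institution of friendship) * (fraction my decision erodes)
friendship_total_value = 1_000_000.0
fraction_eroded = 1e-4
erosion_cost = friendship_total_value * fraction_eroded

# The betrayal is welfare-maximizing only if the net gain outweighs the erosion.
betray = net_social_gain > erosion_cost
print(net_social_gain, erosion_cost, betray)  # 40.0 100.0 False
```

With these made-up numbers the betrayal fails the test, but a different `fraction_eroded` flips the verdict, which is exactly why the comparison is not obvious in advance.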
Plus, Eliezer_Yudkowsky's Timeless Decision Theory assumes that your decisions do have implications for everyone else's decisions!