Eisegates, is there no limit to the number of people you would subject to a punch in the face (very painful but temporary with no risk of death) in order to avoid the torture of one person? What if you personally had to do (at least some of) the punching? I agree that I might not be willing to personally commit the torture despite the terrible (aggregate) harm my refusal would bring, but I'm not proud of that fact - it seems selfish to me. And extrapolating your position seems to justify pretty terrible acts. It seems to me that the punch is equivalent to some very small amount of torture.
Salutator: "...unspoken premise, that a moral system should sort outcomes rather then actions, so that it doesn't matter who would do the torturing or speck-placing."
I fear that I'm missing something. Is this just another way of asking if I would pick up the blowtorch? If it's true, and you seem to agree, that our intuition focuses on actions over outcomes, don't you think that's a problem? Perhaps you're not convinced that our intuition reflects a bias? That we'd make better decisions if we shifted a little bit of our attention to outcomes? Or is the fear that "doing the math" will produce a more biased position than intuition alone? I think that you need to do the math to balance, not (in all cases) replace, your intuition. El's point was that this is particularly important when you scale because your intuition cannot be (blindly) trusted.