MartinH00

I'm following a link from Pharyngula and don't have time to read the comments, so my apologies if I'm repeating something.

I think you're up against the sorites paradox, that you're comparing apples and oranges when you weigh torture against a dust speck, and that there is no practical way to implement the computation you propose.

People who whine about dust specks in their eyes will get an unsympathetic response from me - I care zero about it. People who have been tortured for a minute will have my utmost concern and sympathy. Somewhere along the line, one turns into the other, but in your example, a googolplex of zeroes is still zero.

Torture is qualitatively different from ordinary pain, say the pain of a debilitating disease. Torture involves the intentional infliction of suffering for the sake of the suffering itself, extreme loss of control, the absence of sympathy and empathy, extreme uncertainty about the future, and so on. Its mental impact is qualitatively different from that of accidental pain.

Universal informed consent and shared risk would seem, to my moral gut, to be necessary preconditions for me to stomach this kind of utilitarian calculus.

So this large population, which agrees that the occasional victim enhances overall utility, would share the risk of becoming that victim. In that scenario, how many people would accept a lifetime-torture lottery ticket in exchange for a lifetime free of dust motes? Without knowing the answer to that question, no one in the pool can estimate their own risk.
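
To make that concrete, here is a rough back-of-the-envelope version of the decision, with symbols that are mine rather than the original post's: suppose N people take the ticket, D_torture is the disutility of a lifetime of torture, d_speck is the disutility of a single dust speck, and S is the number of specks a ticket-holder avoids over a lifetime. Accepting the ticket only makes sense if

\[
  \frac{D_{\mathrm{torture}}}{N} \;<\; S \cdot d_{\mathrm{speck}},
\]

and the left-hand side cannot be evaluated without knowing N - which is exactly the number the question above asks for.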