In general, the ethical theory that prevails here on Less Wrong is preference utilitarianism. The fundamental idea is that the correct moral action is the one that satisfies the strongest preferences of the most people. Preferences are discussed in terms of quantities like fun, pain, death, torture, and so on. One of the biggest dilemmas posed on this site is the Torture vs. Dust Specks problem. I should say up front that I would go with dust specks, for some of the reasons I mentioned here. I mention this because it may be biasing my judgment about the question I pose below.
I had a thought recently about another aspect of Torture vs. Dust Specks, and wanted to submit it to Less Wrong Discussion. Namely, do other people's moral intuitions constitute a preference that we should factor into a utilitarian calculation? I would predict, based on human nature, that if the 3^^^3 people were asked whether they would each accept a dust speck in the eye in exchange for sparing another individual 50 years of torture, they would probably vote for dust specks.
Should we assign weight to other people's moral intuitions in the calculation, and if so, how much?
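To make the question concrete, here is a minimal sketch of how such a preference might enter the sum; the symbols are my own stand-ins, not part of the original problem, and 3↑↑↑3 is just 3^^^3 in up-arrow notation. Write $u_s < 0$ for the disutility of one dust speck, $u_t < 0$ for the disutility of 50 years of torture, and $\varepsilon > 0$ for the strength of each person's preference that no one be tortured. Counting intuitions as preferences turns the comparison into

$$3\uparrow\uparrow\uparrow 3 \cdot (u_s + \varepsilon) \quad \text{vs.} \quad u_t,$$

so if $\varepsilon$ is even slightly larger than $|u_s|$, the left side becomes positive and the intuitions by themselves settle the answer in favor of dust specks. That is part of what makes the question interesting: treated as preferences, the intuitions don't merely inform the calculation, they swamp it.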
The real answer to torture vs. dust specks is to recognize that the answer to the scenario as stated is torture, but the scenario itself has a prior probability so astronomically low that no evidence could ever convince you that you were in it, since at most a fraction k/3^^^3 of people can affect the fate of 3^^^3 people at once (where k is the number of times any one person's fate is affected). However, there are higher-probability scenarios that look like torture vs. 3^^^3 dust specks but are actually torture vs. nothing, or torture vs. not-enough-specks-to-care. In philosophical problems we ignore that issue for simplicity and assume the problem statement is true with probability exactly 1, but you can't do that in real life, and in this case intuition sides with reality.
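Spelling that out as an expected-value sketch ($p$ here is just my label for your credence that the scenario is real, and 3↑↑↑3 is 3^^^3 in up-arrow notation): the bound above gives $p \le k / 3\uparrow\uparrow\uparrow 3$, so the expected number of dust specks you actually prevent by choosing torture is at most

$$p \cdot 3\uparrow\uparrow\uparrow 3 \;\le\; \frac{k}{3\uparrow\uparrow\uparrow 3} \cdot 3\uparrow\uparrow\uparrow 3 \;=\; k,$$

a modest, human-scale number of specks, which nobody would accept 50 years of torture to avoid.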
Therefore, answer dust specks, but build theories as though the answer were torture.
The same points as in Pascal's Mugging apply. 3^^^3 has a relatively low Kolmogorov complexity, which means that if someone were simply to tell you the scenario is real, you couldn't assign it a probability anywhere near as small as 1/3^^^3, so the expected number of people affected would still be astronomical.
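To unpack that complexity point a little (this is the standard argument, not something stated above): under a Solomonoff-style prior, a scenario with description complexity $K$ receives probability roughly at least $2^{-K}$, and since 3^^^3 has a short description, $K$ stays small even though the number is huge. The expected number of people affected is then on the order of

$$2^{-K} \cdot 3\uparrow\uparrow\uparrow 3,$$

which is still astronomical, because no penalty of size $2^{-K}$ can cancel a factor of $3\uparrow\uparrow\uparrow 3$.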
There are higher-probability scenarios that do apply; they're just more like torture vs. significantly less torture. The intuitive bias is still strong enough to keep the paradox alive.