In general, the ethical theory that prevails here on Less Wrong is preference utilitarianism. The fundamental idea is that the correct moral action is the one that satisfies the strongest preferences of the most people. Preferences are discussed in units such as fun, pain, death, torture, etc. One of the biggest dilemmas posed on this site is the Torture vs. Dust Specks problem. I should say, up front, that I would go with dust specks, for some of the reasons I mentioned here. I mention this because it may be biasing my judgment of the question I pose below.
I had a thought recently about another aspect of Torture vs. Dust Specks, and wanted to submit it to Less Wrong Discussion. Namely, do other people's moral intuitions constitute a preference that we should factor into a utilitarian calculation? I would predict, based on human nature, that if the 3^^^3 people were asked whether they would each accept a dust speck in the eye in exchange for sparing another individual fifty years of torture, they would vote for dust specks.
Should we assign weight to other people's moral intuitions, and if so, how much weight should they have?
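To make the question concrete, here is a toy sketch in Python of what "factoring intuitions into the calculation" might mean. Every number and variable name in it is made up for illustration; none of it comes from the post itself:

```python
# A toy sketch of the question, with entirely made-up numbers; neither
# the values nor the variable names come from the original post.

N = 3  # stand-in for 3^^^3, which no computer can actually represent

speck_disutility = 1e-9    # assumed harm of one dust speck to one person
torture_disutility = 1e7   # assumed harm of fifty years of torture
intuition_weight = 1e-6    # assumed preference-satisfaction each person
                           # gets from knowing the torture was avoided

# Utilitarian sum that ignores moral intuitions:
specks_cost = N * speck_disutility

# Utilitarian sum that counts each person's intuition as a preference:
specks_cost_with_intuitions = N * (speck_disutility - intuition_weight)

print(specks_cost > torture_disutility)                  # torture is the lesser harm?
print(specks_cost_with_intuitions > torture_disutility)  # still the lesser harm?
```

With these invented numbers the intuition term outweighs the speck itself, so once intuitions count as preferences each speck becomes a net preference-satisfaction and no value of N favors torture; ignore the intuitions, and a large enough N (such as 3^^^3) makes torture the lesser harm. Which of those two sums is the right one is precisely the question above.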
I would probably go for specks here. Many people, I predict, would get an emotional high out of thinking that their sacrifice will prevent someone from being tortured (while scope insensitivity prevents them from realizing just how small 1/3^^^3 is). If they actually received a dust speck in the eye within a short enough interval afterwards, I think there is a significant chance they would take it to mean that specks had been chosen instead of torture (and thus they would get an even bigger high after being specked, if they knew why).
If you polled people in such a way that they wouldn't get the high, then your answer should be the same as before. Scope insensitivity still kicks in, and people will vote without understanding how big 3^^^3 is. I don't understand it myself: even the much smaller 'a 1 followed by a string of zeroes as long as the Bible' is so mind-numbingly big that I know I can't comprehend it, and 3^^^3 dwarfs it.
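For a sense of scale, here is a minimal sketch of Knuth's up-arrow notation, which is what 3^^^3 is written in. The definition is standard mathematics rather than anything from this thread, and the function is illustrative only: it returns correct values for tiny inputs but will never terminate on anything near 3^^^3.

```python
# A minimal sketch of Knuth's up-arrow notation.

def up_arrow(a: int, n: int, b: int) -> int:
    """Compute a (n arrows) b; n = 1 is ordinary exponentiation."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    # With n arrows: a ^...^ b = a ^...^ (a ^...^ (b - 1)),
    # where the outer application uses one fewer arrow.
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 1, 3))  # 3^3  = 27
print(up_arrow(3, 2, 3))  # 3^^3 = 3^(3^3) = 7,625,597,484,987
# 3^^^3 = 3^^(3^^3): a power tower of 3s that is 7,625,597,484,987
# levels tall, vastly larger than "a 1 followed by a string of zeroes
# as long as the Bible".
```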