In general, the ethical theory that prevails here on Less Wrong is preference utilitarianism. The fundamental idea is that the correct moral action is the one that satisfies the strongest preferences of the most people. Preferences are discussed in terms of quantities such as fun, pain, death, torture, etc. One of the biggest dilemmas posed on this site is the Torture vs. Dust Specks problem. I should say, up front, that I would go with dust specks, for some of the reasons I mentioned here. I mention this because it may be biasing my judgments about the question I'm asking here.
I had a thought recently about another aspect of Torture vs. Dust Specks, and wanted to submit it to some Less Wrong Discussion. Namely, do other people's moral intuitions constitute a preference that we should factor into a utilitarian calculation? I would predict, based on human nature, that if the 3^^^3 people were asked whether they were each willing to get a dust speck in the eye in exchange for another individual not being tortured for 50 years, they would probably vote for the dust specks.
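One rough way to formalize the thought (the symbols here are illustrative placeholders I'm introducing, not quantities anyone in the original debate has estimated): let $N = 3\uparrow\uparrow\uparrow 3$ be the number of people, $\varepsilon$ the disutility of one dust speck, $T$ the disutility of 50 years of torture, and $\delta$ the disutility each person assigns to the torture happening against their moral intuition. Then

$$
U_{\text{specks}} = -N\varepsilon, \qquad U_{\text{torture}} = -T - N\delta,
$$

so specks come out ahead exactly when $N\varepsilon < T + N\delta$, i.e. when $N(\varepsilon - \delta) < T$. If each person's intuitive objection to the torture is worth even as much to them as avoiding a dust speck ($\delta \ge \varepsilon$), the specks side wins no matter how large $T$ is; if such intuitions get zero weight, the usual $N\varepsilon \gg T$ argument for torture goes through unchanged.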
Should we assign weight to other people's moral intuitions, and if so, how much weight should they have?
We are in violent agreement (but I'm coming off worse!).
rstarkov suggested that people may have "utility functions" that don't take real values.
Endoself's comment "showed" that this cannot be, but only by starting from the assumption that everybody has a preference system that can be encoded as a real-valued utility function. That assumes away exactly the point in dispute, so the argument is nonsense.
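The standard counterexample (not something raised in the thread itself, just the textbook one) is the lexicographic ordering on $\mathbb{R}^2$, a perfectly coherent preference system with no real-valued representation:

$$
(x_1, y_1) \succ (x_2, y_2) \iff x_1 > x_2 \ \text{ or }\ (x_1 = x_2 \ \text{and}\ y_1 > y_2).
$$

Suppose some $u:\mathbb{R}^2 \to \mathbb{R}$ represented these preferences. For each $x$ we would have $u(x,1) > u(x,0)$, so we could pick a rational $q(x)$ with $u(x,0) < q(x) < u(x,1)$; and for $x < x'$ we would have $u(x,1) < u(x',0)$, so $x \mapsto q(x)$ would be an injection from the uncountable reals into the countable rationals, a contradiction. Hence no real-valued utility function represents these preferences, and assuming one at the outset simply begs rstarkov's question.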
My non-disagreement with you seems to have stemmed from my not wanting to be the first person to say "order-type", and from us making different assumptions about how various posters' positions projected onto our own internal models of "lists" (whatever those were).
You shouldn't have used the words "not relevant"; that implied the statement had no important implications for the problem at all, rather than that it proved the hidden assumption (which is very relevant, since the topic is utilitarianism) wrong for that set of people (unless they bit the bullet).