In general, the ethical theory that prevails here on Less Wrong is preference utilitarianism. The fundamental idea is that the correct moral action is the one that satisfies the strongest preferences of the greatest number of people. Preferences are discussed in units such as fun, pain, death, and torture. One of the biggest dilemmas posed on this site is the Torture vs. Dust Specks problem. I should say, up front, that I would go with dust specks, for some of the reasons I mentioned here. I mention this because it may be biasing my judgment on the question below.
I recently had a thought about another aspect of Torture vs. Dust Specks, and wanted to submit it to Less Wrong Discussion. Namely: do other people's moral intuitions constitute a preference that we should factor into a utilitarian calculation? I would predict, based on human nature, that if the 3^^^3 people were asked whether they would each accept a dust speck in the eye in exchange for sparing another individual 50 years of torture, they would probably vote for dust specks.
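To make the question concrete, here is a minimal toy model of the calculation (all of the numbers are made-up placeholders, and N merely stands in for 3^^^3, which no computer can represent). The point it illustrates: once everyone's anti-torture intuition counts as a preference, both sides of the ledger scale with N, so for astronomically large N the comparison reduces to per-person speck disutility versus per-person intuition strength.

```python
# Toy model: does counting bystanders' anti-torture preferences flip the calculus?
# All values below are illustrative assumptions, not claims about real utilities.

N = 10**12   # stand-in for 3^^^3 (the real number is unimaginably larger)
d = 1e-9     # assumed disutility to one person of one dust speck
T = 1e6      # assumed disutility of 50 years of torture
w = 1e-8     # assumed strength of each person's preference against the torture

# Plain aggregation: specks lose as soon as N * d exceeds T.
specks_cost = N * d

# Counting intuitions: each of the N people also has a frustrated
# preference if the torture happens, adding N * w to the torture side.
torture_cost = T + N * w

# For huge N the fixed term T is negligible, so the comparison is
# effectively d versus w: if w > d, dust specks win no matter what T is.
print("specks: ", specks_cost)
print("torture:", torture_cost)
```

Under these made-up numbers, dust specks win (1,000 vs. ~11,000); with plain aggregation and a genuinely 3^^^3-sized N, torture would win instead. Whether that flip is legitimate is exactly the question.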
Should we assign weight to other people's moral intuitions, and if so, how much?
What is your evidence for this? In The Preference Utilitarian's Time Inconsistency Problem, the top-voted comments didn't try to solve the problem posed for preference utilitarians; instead, they made general arguments against preference utilitarianism.