Wanting to want X is again distinct from believing that you want X. Perhaps you believe that you want to want X, but you don't actually want to want X, you want to want Y instead, while currently you want Z and believe that you want W. (This is not about conscious vs. subconscious, this is about not confusing epistemic estimates of values with the values themselves, whatever nature each of these has.)
(See also An Epistemological Nightmare; I'm not joking though.)
Good link. I agree with guarding against wrong epistemic estimates of values (good wording).
Our disagreement comes down to this (I think): when someone says "I want to want X", is this
a) an epistemic estimate of a value
b) a value in itself, pattern-matching "I want Y", with Y being "to want X"?
Consider an LW reader saying "I want to be a more rational reasoning agent", when previously she did not. (This does not fit the "want to want" pattern, but it likewise states a potentially new element of a utility function, potentially at odds with the t...
Many people see themselves as members of various groups (the population of their home country, or their social network), and feel justified in caring more about the well-being of people in that group than about the well-being of others. They will argue from reciprocity: "Those people pay taxes in our country; they are entitled to more support from 'us' than others are!" My question is: Is this inconsistent with some rationality axioms that seem obvious? What often-adopted or otherwise reasonable axioms are there that would make it inconsistent?