I don't expect that being systematically wrong about your own values would be desirable.
(See clarification in the grandparent)
Isn't your present self the determinant of your terminal values? The blueprint you compare against? Isn't it a tautology that your current utility function is the utility function of your present self?
If so, wouldn't a desire, at any point in time, to reprogram part of your own utility function itself mean that such a change is already a justified part of your present utility function?
If there is some tension between your conscious desires ("I want to feel this or that way about this or th...
Many people see themselves as members of various groups (the population of their home country, or their social network), and feel justified in caring more about the well-being of people in this group than about that of others. They will argue from reciprocity: "Those people pay taxes in our country; they are entitled to more support from 'us' than others!" My question is: Is this inconsistent with any rationality axioms that seem obvious? Which commonly adopted or otherwise reasonable axioms would make it inconsistent?