Sure, people usually argue whether something is "true or false" because such status makes a difference (at least potentially) to their pain or pleasure, happiness, utility, etc.
So you say. I can think of two arguments against that: people acquire true beliefs that aren't immediately useful, and untrue beliefs can be pleasing.
It is not the case that all beliefs can do is predict experience based on existing preferences. Beliefs can also set and modify preferences. I have given that counterargument several times.
I think moral values are ultimate because I can't think of a valid argument of the form "I should do <immoral thing> because <excuse>". Please give an example of a pangalactic value that can be substituted for <excuse>.
Yeah, but it still comes back to truth. If I tell you it will increase your happiness to hit yourself on the head with a hammer, your response is going to have to amount to "no, that's not true".
By being (relatively) uninfluenced by personal feelings, interpretations, or prejudice; based on facts; unbiased.
You haven't remotely established that as an identity. It is true that some people some of the time arrive at values through feelings. Others arrive at them (or revise them) through facts and thinking.
"Values can be defined as broad preferences concerning appropriate courses of action or outcomes"
I agree, if you mean things like, "If I now believe that she is really a he, I don't want to take 'her' home anymore."
Neither can I. I just don't draw the same conclusion. There's a difference between disagreeing with something and not knowing what it means, and I do seriously not know what you mean. I'm not sure why you would think it is veiled disagreement, seeing as lukeprog's whole post was making this very same point about incoherence. (But incoherence also only has meaning in the sense of "incoherent to me" or someone else, so it's not some kind of damning word. It simply means the message is not getting through to me. That could be your fault, my fault, or English's fault, and I don't really care which it is, but it would be preferable for something to actually make it across the inferential gap.)
EDIT: Oops, posted too soon.
So basically you are saying that preferences can change because of facts/beliefs, right? And I agree with that. To give a more mundane example, if I learn Safeway doesn't carry egg nog and I want egg nog, I may no longer want to go to Safeway. If I learn that egg nog is bad for my health, I may no longer want egg nog. If I believe health doesn't matter because the Singularity is near, I may want egg nog again. If I believe that egg nog is actually made of human brains, I may not want it anymore.
At bottom, I act to get enjoyment and/or avoid pain, that is, to win. What actions I believe will bring me enjoyment will indeed vary depending on my beliefs. But it is always ultimately that winning/happiness/enjoyment/fun/deliciousness/pleasure that I am after, and no change in belief can change that. I could take short-term pain for long-term gain, but that would be because I feel better doing that than not.
But it seems to me that just because what I want can be influenced by what could be called objective or factual beliefs doesn't make my want for deliciousness "uninfluenced by personal feelings."
In summary, value/preferences can either be defined to include (1) only personal feelings (though they may be universal or semi-universal), or to also include (2) beliefs about what would or wouldn't lead to such personal feelings. I can see how you mean that 2 could be objective, and then would want to call them thus "objective values." But not for 1, because personal feelings are, well, personal.
If so, then it seems I am back to my initial response to lukeprog and the ensuing brief discussion. In short, if it is only the belief in objective facts that is wrong, then I wouldn't want to call that morality, but more just self-help, or just what the whole rest of LW is. It is not that someone could be wrong about their preferences/values in sense 1, only in sense 2.