When you hear powerful evidence or arguments that should get you to revise your beliefs, not only do all sorts of cognitive biases fight the change, but so do the social factors of status and face saving. Perhaps I've long been a vocal proponent of X, which implies Y, and you show me that Y isn't always true. It's very hard to just straight up admit "ok, I'm not a hardcore Xist anymore." There's a status loss in letting yourself be convinced.
For a long time I thought that I was stronger than this, that saving face only mattered as much as I let it matter. I wish I could freely admit when I've been convinced, but I often can't manage to. [1] Instead I'll finish a conversation defending my earlier beliefs and only later start acting on my new ones.
After a discussion where someone didn't admit to any change of mind, I'll often see them later having changed their behavior. So now if I'm trying to persuade someone I don't focus on securing verbal agreement. Instead I just try to be as convincing as possible, and notice if they come around later. [2]
(I also posted this on my blog)
[1] This is not a helpful trait: I'd like other people to let me know when I'm wrong or when they have evidence I'm not considering, but if they never get the satisfaction of knowing they've convinced me, they may feel like they've wasted their time and not try in the future. So I'm working on it.
[2] Keeping people from feeling personally invested in one side or the other of an argument is probably also helpful: I understand discussions are much more likely to convince bystanders than participants.
To my thinking, this stance forfeits rational reflection where it really counts most. You're saying, if I understand you, that you respect people who change their opinions on factual matters, but not on questions of fundamental ethics. This seems to assume, among other things, that people's values are much more coherent than they are (leaving little leverage for change).
You lose much more status, it is true, when you re-evaluate your terminal values than your factual contentions. That just means the problems of self-confirmation are compounded in ethics, not that they should be ignored there. You can't be rational yet rigidly maintain your terminal values' immunity to rational argument.
Any argument that my terminal values should be one thing or another will itself be founded on certain assumed values. You can't start from a value-neutral position and get to a value system from there.
If rational argument alone is enough to cause a change in one's values, I can see only a few possibilities:
The changed values were instrumental values rather than terminal values. It makes perfect sense to modify instrumental values if one no longer believes they serve one's terminal values.