Spoilers for mad investor chaos and the woman of asmodeus (planecrash Book 1).
The Watcher spoke on, then, about how most people have selfish and unselfish parts - not selfish and unselfish components in their utility function, but parts of themselves in some less Law-aspiring way than that. Something with a utility function, if it values an apple 1% more than an orange, if offered a million apple-or-orange choices, will choose a million apples and zero oranges. The division within most people into selfish and unselfish components is not like that, you cannot feed it all with unselfish choices whatever the ratio. Not unless you are a Keeper, maybe, who has made yourself sharper and more coherent; or maybe not even then, who knows? For (it was said in another place) it is hazardous to non-Keepers to know too much about exactly how Keepers think.
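As a small aside on that first claim, here is a minimal sketch in Python (my own framing, not the post's) of why a single coherent utility function behaves that way: the argmax over identical offers comes out the same every time, so no mix of apples and oranges ever appears, however small the 1% edge is.

```python
# Minimal sketch (mine, not the post's): a single coherent utility function that
# rates an apple at 1.01 and an orange at 1.00 picks the apple on every one of a
# million identical offers -- the 1% edge never gets "traded off" across choices.
def tally(apple_utility: float = 1.01, orange_utility: float = 1.00,
          offers: int = 1_000_000) -> tuple[int, int]:
    apples = sum(1 for _ in range(offers) if apple_utility > orange_utility)
    return apples, offers - apples

print(tally())  # (1000000, 0)
```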
It is dangerous to believe, said the Watcher, that you get extra virtue points the more that you let your altruistic part hammer down the selfish part. If you were older, said the Watcher, if you were more able to dissect thoughts into their parts and catalogue their effects, you would have noticed at once how this whole parable of the drowning child was set up to crush down the selfish part of you, to make it look like you would be invalid and shameful and harmful-to-others if the selfish part of you won, because, you're meant to think, people don't need expensive clothing - although somebody who's spent a lot on expensive clothing clearly has some use for it or some part of themselves that desires it quite strongly.
I've been thinking a lot lately about exactly how altruistic I am. The truth is that I'm not sure: I care a lot about not dying, and about my girlfriend and family and friends not dying, and about all of humanity not dying, and about all life on this planet not dying too. And I care about the glorious transhuman future and all that, and the vast number (or whatever it is) of possible good future lives hanging in the balance.
And I care about some of these things disproportionately to their apparent moral magnitude. But, what I care about is what I care about. Rationality is the art of getting more of what you want, whatever that is; of systematized winning, by your own lights. You will totally fail in that art if you bulldoze your values in a desperate effort to fit in, or to be a "good" person, in the way your model of society seems to ask you to. What you ought to do instead is protect your brain's balance of undigested value-judgements: be corrigible to the person you will eventually, on reflection, grow up to be. Don't rush to lock in any bad, "good"-sounding values now; you are allowed to think for yourself and discover what you stably value.
It is not the Way to do what is "right," or even to do what is "right" instrumentally effectively. The Way is to get more of what you want and endorse on reflection, whatever that ultimately is, through instrumental efficacy. If you want that, you'll have to protect the kernel encoding those still-inchoate values, in order to ever-so-slowly tease out what those values are. How you feel is your only guide to what matters. Eventually, everything you care about could be generated from that wellspring.
The story involves a lot of "looking down" on people for being ineffective out of confusion or underdevelopment. There is probably value in being aware of your value structure and not being in error about it, but it also feels like other agents might be categorised as confused when they simply have a value structure drawn along different lines than yours.
Say that I care a lot about wormless apples and dislike wormy apples, and that I face many choices between an apple and an orange. Every time the apple has a worm, I pick the orange. From the perspective of an outsider (who might not have my worm-detection abilities), it can look like 100 times out of 1000 I go for the orange instead of the apple. That outsider might then be tempted to think that I am not following a utility function, that I am confused or contradictory in my fruit buying. But by my own terms and perception I am following one. And even if I only had an unconscious ugh-field around worms that I non-conceptually intuit are there, would that unawareness of the function make me not follow it?
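A minimal sketch of that hidden-variable point, under an illustrative setup of my own (the 10% worm rate and the function names are assumptions, not from the story): the agent's choice is a deterministic function of the full state, but an observer who cannot see worms records what looks like inconsistent fruit preferences.

```python
import random

# Hypothetical sketch: the agent's choices are a deterministic function of the
# full state (fruit kind + worm status), but an observer who cannot see worms
# only records "apple offered, orange chosen" and concludes the agent is
# inconsistent about fruit.
def agent_choice(apple_has_worm: bool) -> str:
    # Utility over the *full* state: wormless apple > orange > wormy apple.
    return "orange" if apple_has_worm else "apple"

random.seed(0)
trials = [random.random() < 0.10 for _ in range(1000)]   # ~10% of apples have worms
observed = [agent_choice(worm) for worm in trials]

# To the worm-blind observer, the same visible offer yields two different answers.
print(observed.count("apple"), observed.count("orange"))  # roughly 900 vs 100
```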
So if people whose values are complex and fragile in ways imperceptible to me look the same to me as confused people, then going around judging everybody who doesn't make sense in my terms can turn into trying to hammer people from unfamiliar shapes into familiar shapes, which could be a form of xenophobia. I guess in Keltham's situation, assuming that anybody different is lesser can make sense. And in the area of sexuality, Keltham is not sure whether his world is strictly superior. But what about the areas where the comparison is not easy? The general pattern of Keltham learning about Cheliax is that "why are people this stupid?" gets explained by a bad environment. Markets are screwy because there are no good roads, not simply because traders are stupid. That tends to make fixing the situation more pressing, but less judgmental toward the actors. Infrastructure that Keltham just took for granted is revealed to have conditions for existing.
In the story, the "societally right" person in Cheliax is expected to be Evil (which has a narrower meaning in the setting). The bashing and shaming of people's Good parts works through the same logic. There is a difference between conceiving of rationality as "the art of getting more of what *you* want" and as "the art of getting more of what you *want*". The point here is more about "expected to want" vs "actually want", not about "others want" vs "I want".