A small-scale and probably more common example: a friend who lost his job because his co-workers and his immediate boss didn't trust him, since he wouldn't pad his expense account.
I think the problem was that having a non-standard moral system meant he was too unpredictable.
Admittedly (and I only have his account of his life), there was another problem: he has Asperger's and he's a large guy. Neurotypicals (perhaps only neurotypical men) would automatically see him as physically threatening.
If you want to offer advice about the situation he was in, please make it hypothetical. At this point, his depression and anxiety are bad enough that he's on disability and not in the job market.
For another example, note that whistle-blowers need legal protection to keep their jobs. In the case of the man who released the Abu Ghraib photos, he needed protection because of death threats.
Where does signaling absolute honesty fit in a world where many people make it unwelcome?
They were probably worried that he would inform on others for padding their expense accounts. Someone who follows a stricter set of moral rules, but doesn't plan to force those rules on others, should make clear that they don't mind others following less strict rules, within reason, and wouldn't make trouble over them.
The Black Belt Bayesian writes:
Eliezer adds:
These are both radically high standards of honesty. Thus, it is easy to miss the fact that they are radically different standards of honesty. Let us look at a boundary case.
Thomblake puts the matter vividly:
So, let us say that you are living in Nazi Germany during WWII, and you have a Jewish family hiding upstairs. There are a couple of brownshirts with rifles knocking on your door. What do you do?
I see four obvious responses to this problem (though there may be more).
I am certain that Yvain could have a field day with the myriad ways in which response 4 does not represent rational discourse. Nonetheless, in this limited problem, it wins.
(It should also be noted that response 4 came to me after about 15 minutes of thinking about the problem. If I actually had Jews in my attic, and lived in Nazi Germany, I might have thought of something better.)
However:
What if you live in the impossible possible world in which a nuclear blast could ignite the atmosphere of the entire earth? What if you are yourself a nuclear scientist, and have proven this to yourself beyond any doubt, but cannot convey the whole of the argument to a layman? The fate of the whole world could depend on your superiors believing you to be the sort of man who will not tell a lie. And, of course, in order to be the sort of man who would not tell a lie, you must not tell lies.
Do we have wiggle room here? Neither your superior officer nor the two teenaged brownshirts are Omega, but your superior bears a far greater resemblance. The brownshirts are young and ruled by hormones. It is easy to practice the Dark Arts against them and get away with it. Is it possible to grab the low-hanging fruit to be had by deceiving fools (at least, those who are evil and whose tires you would willingly slash), while retaining the benefits of being believed by the wise?
I am honestly unsure, and so I put the question to you all.
ETA: I have of course forgotten about the unrealistically optimistic option:
5: Really, truly, promote maximally accurate beliefs. Teach the soldiers rationality from the ground up. Explain affective death spirals to them, and make them see that they are involved in one. Help them to understand that their own morality assigns value to the lives hidden upstairs. Convince them to stop being Nazis, and to help you protect your charges.
If you can pull this off without winding up in a concentration camp yourself (along with the family you've been sheltering), you are a vastly better rationalist than I, or (I suspect) anyone else on this forum.