That's an interesting case because, if the Nazi is well-informed about my goals, he will probably be aware that I'd lie to him for things short of the end of the world and he could easily suspect that I'm falsely informing him of this risk in order to get him not to blow up people I'd prefer to leave intact. If all he knows about my goals is that I don't want the world to end, whether he heeds my warnings depends on his uninformed guess about the rest of my beliefs, which could fall either way.
That's why I think that if, say, a scientist were tempted by the Noble Lie "this bomb would actually destroy the whole earth, we cannot work on it any further," this would be a terrible decision. By the same logic that says I hand Omega $100 so that counterfactual me gets $10,000, I should not attempt to lie about such a risk, so that counterfactual me can be believed where the risk actually exists.
The Black Belt Bayesian writes:
Eliezer adds:
These are both radically high standards of honesty. Thus, it is easy to miss the fact that they are radically different standards of honesty. Let us look at a boundary case.
Thomblake puts the matter vividly:
So, let us say that you are living in Nazi Germany, during WWII, and you have a Jewish family hiding upstairs. There's a couple of brownshirts with rifles knocking on your door. What do you do?
I see four obvious responses to this problem (though there may be more).
I am certain that Yvain could have a field day with the myriad ways in which response 4 does not represent rational discourse. Nonetheless, in this limited problem, it wins.
(It should also be noted that response 4 came to me in about 15 minutes of thinking about the problem. If I actually had Jews in my attic, and lived in Nazi Germany, I might have thought of something better).
However:
What if you live in the impossible possible world in which a nuclear blast could ignite the atmosphere of the entire earth? What if you are yourself a nuclear scientist, and have proven this to yourself beyond any doubt, but cannot convey the whole of the argument to a layman? The fate of the whole world could depend on your superiors believing you to be the sort of man who will not tell a lie. And, of course, in order to be the sort of man who would not tell a lie, you must not tell lies.
Do we have wiggle room here? Neither your superior officer nor the two teenaged brownshirts are Omega, but your superior bears a far greater resemblance. The brownshirts are young and ruled by hormones. It is easy to practice the Dark Arts against them, and get away with it. Is it possible to grab the low-hanging fruit to be had by deceiving fools (at least, those who are evil and whose tires you would willingly slash), while retaining the benefits of being believed by the wise?
I am honestly unsure, and so I put the question to you all.
ETA: I have of course forgotten about the unrealistically optimistic option:
5: Really, truly, promote maximally accurate beliefs. Teach the soldiers rationality from the ground up. Explain to them about affective death spirals, and make them see that they are involved in one. Help them to understand that their own morality assigns value to the lives hidden upstairs. Convince them to stop being Nazis, and to help you protect your charges.
If you can pull this off without winding up in a concentration camp yourself (along with the family you've been sheltering), you are a vastly better rationalist than I, or (I suspect) anyone else on this forum.