MBlume comments on Degrees of Radical Honesty - Less Wrong

Post author: MBlume 31 March 2009 08:36PM




Comment author: Alicorn 01 April 2009 12:48:44AM 14 points

Telling the truth is an expression of trust, in addition to being a way to earn it: telling someone something true that could be misused is saying "I trust you to behave appropriately with this information". The fact that I would lie to the brownshirts as convincingly as possible shouldn't cause anyone else to mistrust me as long as 1) they know my goals; 2) I know their goals and they know that I do; 3) our goals align, at least contextually; and 4) they know that I'm not just a pathological liar who'll lie for no reason. The Nazis will be misled about (1), because that's the part of their knowledge I can manipulate most directly, but anyone with whom I share much of a trust relationship (the teenage daughter playing the piano, perhaps) will know better, because they'll be aware that I'm sheltering Jews and lying to Nazis.

The fact that I would lie to save the world should only cause someone to mistrust my statements on the eve of the apocalypse if they think that I think that they don't want to save the world.

Comment author: MBlume 01 April 2009 01:11:40AM 3 points

What if you need to explain to a Nazi general that the bomb he's having developed could destroy the world? Your goals don't align, except in the fairly narrow sense that neither of you wants to destroy the world.

Comment author: Alicorn 01 April 2009 03:00:08PM 2 points

That's an interesting case because, if the Nazi is well-informed about my goals, he will probably be aware that I'd lie to him for things short of the end of the world and he could easily suspect that I'm falsely informing him of this risk in order to get him not to blow up people I'd prefer to leave intact. If all he knows about my goals is that I don't want the world to end, whether he heeds my warnings depends on his uninformed guess about the rest of my beliefs, which could fall either way.

Comment author: MBlume 01 April 2009 07:27:43PM 0 points

That's why I think that if, say, a scientist were tempted by the Noble Lie "this bomb would actually destroy the whole earth, we cannot work on it any further," this would be a terrible decision. By the same logic that says I hand Omega $100 so that counterfactual me gets $10,000, I should not attempt to lie about such a risk, so that counterfactual me can be believed when the risk actually exists.
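The Omega reference is to the counterfactual-mugging thought experiment, and its arithmetic can be sketched directly. This is a minimal illustration, assuming the standard setup of a fair coin where Omega rewards only agents whose policy is to pay when asked; the $100 and $10,000 payoffs are the ones named in the comment.

```python
def expected_value(pays_when_asked: bool,
                   cost: float = 100.0,
                   reward: float = 10_000.0,
                   p_heads: float = 0.5) -> float:
    """Expected winnings of a policy, evaluated before the coin flip.

    On tails, Omega demands `cost`; on heads, Omega pays `reward`,
    but only to an agent whose policy is to pay on tails.
    """
    ev_tails = -cost if pays_when_asked else 0.0
    ev_heads = reward if pays_when_asked else 0.0  # reward is conditional on the policy
    return p_heads * ev_heads + (1 - p_heads) * ev_tails

print(expected_value(pays_when_asked=True))   # 4950.0
print(expected_value(pays_when_asked=False))  # 0.0
```

The committed payer comes out ahead in expectation, which is the analogy being drawn: a standing policy of honesty about world-ending risks does better than a policy that lies whenever the lie pays locally, because only the former is believed when it matters.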