Deception and Self-Doubt

8 Psychohistorian 11 March 2010 02:39AM

A little while ago, I argued with a friend of mine over the efficiency of the Chinese government. I admitted he was clearly better informed on the subject than I. At one point, however, he claimed that the Chinese government executed fewer people than the US government. This statement is flat-out wrong; China executes ten times as many people as the US, if not far more. It's a blatant lie. I called him on it, and he copped to it. The outcome is beside the point. Why does it matter that he lied? In this case, it provides weak evidence that the basics of his claim were wrong: that he knew the point he was arguing was, at least on some level, incorrect.

The fact that a person is willing to lie indefensibly in order to support their side of an argument shows that they have put "winning" the argument at the top of their priorities. Furthermore, they've decided, based on the evidence they have available, that lying was a more effective way to advance their argument than telling the truth. While exceptions obviously exist, if you believe that lying to a reasonably intelligent audience is the best way of advancing your claim, this suggests that you know your claim is ill-founded, even if you don't admit this fact to yourself.

continue reading »

Rational lies

6 alexflint 23 November 2009 03:32AM

If I were sitting opposite a psychopath who had a particular sensitivity about ants, and I knew that if I told him that ants have six legs then he would jump up and start killing the surrounding people, then it would be difficult to justify telling him my wonderful fact about ants, regardless of whether I believe that ants really have six legs or not.

Or suppose I knew my friend's wife was cheating on him, but I also knew that he was terminally ill and would die within the next few weeks. The question of whether or not to inform him of my knowledge is genuinely complex, and the truth or falsity of my knowledge about his wife is only one factor in the answer. Different people may disagree about the correct course of action, but no-one would claim that the only relevant fact is the truth of the statement that his wife is cheating on him.

This is all a standard result of expected utility maximization, of course. Vocalizing or otherwise communicating a belief is itself an action, and just like any other action it has a set of possible outcomes, to which we assign probabilities as well as some utility within our value coordinates. We then average out the utilities over the possible outcomes for each action, weighted by the probability that they will actually happen, and choose the action that maximizes this expected utility. Well, that's the gist of the situation, anyway. Much has been written on this site about the implications of expected utility maximization under more exotic conditions such as mind splitting and merging, but I'm going to be talking about more mundane situations, and the point I want to make is that beliefs are very different objects from the act of communicating those beliefs.
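The averaging-and-maximizing step described above can be sketched in a few lines. This is a minimal illustration, not anything from the original post; the action names, probabilities, and utilities are invented purely to mirror the ant example:

```python
def expected_utility(outcomes):
    """Average utility over possible outcomes, weighted by probability."""
    return sum(p * u for p, u in outcomes)

def best_action(actions):
    """Choose the action that maximizes expected utility."""
    return max(actions, key=lambda name: expected_utility(actions[name]))

# Hypothetical model of the act of communicating a belief:
# each action maps to a list of (probability, utility) pairs.
actions = {
    "state the fact": [(0.9, -100.0), (0.1, 1.0)],  # likely provokes harm
    "stay silent":    [(1.0, 0.0)],                 # nothing happens
}

print(best_action(actions))  # → stay silent
```

The point the example makes concrete is that the truth of the belief never appears in the calculation directly; only the consequences of voicing it do.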

continue reading »

Light Arts

13 Alicorn 06 November 2009 03:54AM

tl;dr: It is worthwhile to convince people that they already, by their own lights, have reasons to believe true things, as this is faster, easier, nicer, and more effective than helping them create from scratch reasons to believe those things.

This is not part of the problem-solving sequence.  I do plan to finish that, but the last post is eluding me.

Related: Whatever it is I was thinking of here (let me know if you can dig up what it was).

Today, while waiting for a bus, I heard the two girls sitting on the bench next to mine talking about organ donation.  One said that she was thinking of ceasing to be an organ donor, because she'd heard that doctors don't try as hard to save donors in hopes of using their organs to save other lives.

My bus was approaching.  I didn't know the girl and could hardly follow up later with an arsenal of ironclad counterarguments.  There was no time, and probably no receptivity, to engage in a lengthy discussion of why this medical behavior wouldn't happen.  No chance to fire up my computer, try to get on the nearest wireless, and pull up empirical stats that say it doesn't happen.

So I chuckled and interjected, at a convenient gap in her ramble, "That's why you carry a blood donor card, too, so they think if you stay alive they'll keep getting blood from you!"

Some far-off potential tragic crisis averted?  Maybe.  She looked thoughtful, nodded, said that she did have a blood donor card, and that my suggestion made sense.  I boarded my bus and it carried me away.  I hope she's never hit by a cement truck.  I hope that if she is hit by a cement truck, a stupid rumor she heard once doesn't turn it into as complete a waste as it would have to be without the wonders of organ transplant.

continue reading »

The Price of Integrity

-5 Aurini 23 July 2009 04:30AM

Related Posts: Prices or Bindings?

On the evening of August 14th, 2006, a pair of Fox News journalists, Steve Centanni and Olaf Wiig, were seized by Islamic militants while on assignment in Gaza City.  Nothing was heard of them for nine days, until a group calling themselves the Holy Jihad Brigades took credit for the kidnappings.  They issued an ultimatum, demanding the release of Muslim prisoners from American jails within a 72-hour time frame.  Their demands were not met.

But then, a few days later, the journalists were allowed to go free... though not before they'd been forced to convert to Islam at gunpoint, and had each videotaped a statement denouncing U.S. and Israeli foreign policy.

The war raged on.

A couple of kidnapped journalists is nothing new (certainly not three years after the fact), and aside from the happy ending, this particular case wouldn't be worth mentioning if not for a unique twist that occurred after they returned home.  A fellow Fox News contributor, Sandy Rios, openly criticized the two men; she said that no true Christian would convert – falsely or otherwise – merely because they were threatened with death.  As she later explained to Bill Maher:*

continue reading »

Not Technically Lying

32 Psychohistorian 04 July 2009 06:40PM

I'm sorry I took so long to post this. My computer broke a little while ago. I promise this will be relevant later.

A surgeon has to perform emergency surgery on a patient. No painkillers of any kind are available. The surgeon takes an inert saline IV and hooks it up to the patient, hoping that the illusion of extra treatment will make the patient more comfortable. The patient asks, "What's in that?" The doctor has a few options:

  1. "It's a saline IV. It shouldn't do anything itself, but if you believe it's a painkiller, it'll make this less painful."
  2. "Morphine."
  3. "The strongest painkiller I have."

- The first explanation is not only true, but maximizes the patient's understanding of the world.
- The second is obviously a lie, though, in this case, it is a lie with a clear intended positive effect: if the patient thinks he's getting morphine, then, due to the placebo effect, there is a very real chance he will experience less subjective pain.
- The third is, in a sense, both true and a lie. It is technically true. However, it's somewhat arbitrary; the doctor could easily have said "It's the weakest painkiller I have," or "It's the strongest sedative I have," or any number of other technically true but misleading statements. This statement is clearly intended to mislead the hearer into thinking it is a potent painkiller; it promotes false beliefs while not quite being a false statement. It's Not Technically Lying. It seems to deserve most, if not all, of the disapproval that actual lying does; its technical truth does not save it. Because language does not specify single, clear meanings, we can often use language where the obvious meaning is false and the non-obvious meaning true, intentionally promoting false beliefs without making false statements.

continue reading »

Honesty: Beyond Internal Truth

40 Eliezer_Yudkowsky 06 June 2009 02:59AM

When I expect to meet new people who have no idea who I am, I often wear a button on my shirt that says:

SPEAK THE TRUTH,
EVEN IF YOUR VOICE TREMBLES

Honesty toward others, it seems to me, obviously bears some relation to rationality.  In practice, the people I know who seem to make unusual efforts at rationality are unusually honest, or, failing that, at least have unusually bad social skills.

And yet it must be admitted and fully acknowledged, that such morals are encoded nowhere in probability theory.  There is no theorem which proves a rationalist must be honest - must speak aloud their probability estimates.  I have said little of honesty myself, these past two years; the art which I've presented has been more along the lines of:

SPEAK THE TRUTH INTERNALLY,
EVEN IF YOUR BRAIN TREMBLES

I do think I've conducted my life in such fashion, that I can wear the original button without shame.  But I do not always say aloud all my thoughts.  And in fact there are times when my tongue emits a lie.  What I write is true to the best of my knowledge, because I can look it over and check before publishing.  What I say aloud sometimes comes out false because my tongue moves faster than my deliberative intelligence can look it over and spot the distortion.  Oh, we're not talking about grotesque major falsehoods - but the first words off my tongue sometimes shade reality, twist events just a little toward the way they should have happened...

From the inside, it feels a lot like the experience of un-consciously-chosen, perceptual-speed, internal rationalization.  I would even say that so far as I can tell, it's the same brain hardware running in both cases - that it's just a circuit for lying in general, both for lying to others and lying to ourselves, activated whenever reality begins to feel inconvenient.

continue reading »

Declare your signaling and hidden agendas

19 Kaj_Sotala 13 April 2009 12:01PM

Follow-up to: It's okay to be (at least a little) irrational

Many science journals require their authors to declare any competing interests they happen to have. For instance, if you're submitting a study about the health effects of tobacco, and you happen to sit on the board of directors of a major tobacco company, you're supposed to say that out loud. 

The process obviously isn't perfect, as most journals don't have the resources to ensure their authors do actually declare all competing interests. On the whole, though, it helps protect both the readers and the authors. The readers, because they'll know to be more careful in evaluating the reports of researchers who might be biased. The authors, because by declaring any competing interests upfront, they're protected from later accusations of dishonesty. (That's the theory, at least. In practice, authors often don't declare their interests, even if they should.)

Signaling has been discussed a lot on Overcoming Bias, though a bit less on Less Wrong. A large fraction of people's behavior is actually intended to signal some qualities to others, though this isn't necessarily a conscious process. On the other hand, it often is. It seems to me that many seasoned OB/LW readers would instinctively try to avoid giving the impression of excess signaling. We're rationalists, after all! We're trying to find the truth, not show off or convince others of our worth!

As if we even could avoid trying to make a good impression on others, or avoid having other kinds of hidden agendas. We're not any less human simply because we have rallied to rationality's banner. (Not to mention that signaling isn't a bad thing by itself - humanity would be in a very poor state if we didn't have any signals about what others were like.) So, in the interest of self-honesty, I suggest we all begin explicitly declaring our (conscious) hidden agendas and signaling intentions when writing posts. As with the policy of scholarly journals, this will help both readers and writers, and in this case also serve a third and fourth function: making us more honest with ourselves, and making people realize that it's okay to have hidden agendas and that they don't have to pretend they don't have any. I'll start out with mine.

continue reading »

Of Lies and Black Swan Blowups

15 Eliezer_Yudkowsky 07 April 2009 06:26PM

Followup to: Entangled Truths, Contagious Lies

Judge Marcus Einfeld, age 70, Queen's Counsel since 1977, Australian Living Treasure 1997, United Nations Peace Award 2002, founding president of Australia's Human Rights and Equal Opportunities Commission, retired a few years back but routinely brought back to judge important cases...

...is going to jail for at least two years over a series of perjuries and lies that started with a £36, 6-mph-over speeding ticket.

That whole suspiciously virtuous-sounding theory about honest people not being good at lying, and entangled traces being left somewhere, and the entire thing blowing up in a Black Swan epic fail, actually does have a certain number of exemplars in real life, though obvious selective reporting is at work in our hearing about this one.

Part of the Against Rationalization subsequence of How To Actually Change Your Mind


Degrees of Radical Honesty

30 MBlume 31 March 2009 08:36PM

The Black Belt Bayesian writes:

Promoting less than maximally accurate beliefs is an act of sabotage. Don’t do it to anyone unless you’d also slash their tires, because they’re Nazis or whatever.

Eliezer adds:

If you'll lie when the fate of the world is at stake, and others can guess that fact about you, then, at the moment when the fate of the world is at stake, that's the moment when your words become the whistling of the wind.

These are both radically high standards of honesty. Thus, it is easy to miss the fact that they are radically different standards of honesty. Let us look at a boundary case.

Thomblake puts the matter vividly:

Suppose that Anne Frank is hiding in the attic, and the Nazis come asking if she's there. Harry doesn't want to tell them, but Stan insists he mustn't deceive the Nazis, regardless of his commitment to save Anne's life.

So, let us say that you are living in Nazi Germany, during WWII, and you have a Jewish family hiding upstairs. There's a couple of brownshirts with rifles knocking on your door. What do you do?

continue reading »
