
The Sin of Underconfidence

Post author: Eliezer_Yudkowsky, 20 April 2009 06:30AM · 55 points

There are three great besetting sins of rationalists in particular, and the third of these is underconfidence.  Michael Vassar regularly accuses me of this sin, which makes him unique among the entire population of the Earth.

But he's actually quite right to worry, and I worry too, and any adept rationalist will probably spend a fair amount of time worrying about it.  When subjects know about a bias or are warned about a bias, overcorrection is not unheard of as an experimental result.  That's what makes a lot of cognitive subtasks so troublesome—you know you're biased but you're not sure how much, and you don't know if you're correcting enough—and so perhaps you ought to correct a little more, and then a little more, but is that enough?  Or have you, perhaps, far overshot?  Are you now perhaps worse off than if you hadn't tried any correction?

You contemplate the matter, feeling more and more lost, and the very task of estimation begins to feel increasingly futile...

And when it comes to the particular questions of confidence, overconfidence, and underconfidence—being interpreted now in the broader sense, not just calibrated confidence intervals—then there is a natural tendency to cast overconfidence as the sin of pride, out of that other list which never warned against the improper use of humility or the abuse of doubt.  To place yourself too high—to overreach your proper place—to think too much of yourself—to put yourself forward—to put down your fellows by implicit comparison—and the consequences of humiliation and being cast down, perhaps publicly—are these not loathsome and fearsome things?

To be too modest—seems lighter by comparison; it wouldn't be so humiliating to be called on it publicly, indeed, finding out that you're better than you imagined might come as a warm surprise; and to put yourself down, and others implicitly above, has a positive tinge of niceness about it, it's the sort of thing that Gandalf would do.

So if you have learned a thousand ways that humans fall into error and read a hundred experimental results in which anonymous subjects are humiliated by their overconfidence—heck, even if you've just read a couple of dozen—and you don't know exactly how overconfident you are—then yes, you might genuinely be in danger of nudging yourself a step too far down.

I have no perfect formula to give you that will counteract this.  But I have an item or two of advice.

What is the danger of underconfidence?

Passing up opportunities.  Not doing things you could have done, but didn't try (hard enough).

So here's a first item of advice:  If there's a way to find out how good you are, the thing to do is test it.  A hypothesis affords testing; hypotheses about your own abilities likewise.  Once upon a time it seemed to me that I ought to be able to win at the AI-Box Experiment; and it seemed like a very doubtful and hubristic thought; so I tested it.  Then later it seemed to me that I might be able to win even with large sums of money at stake, and I tested that, but I only won 1 time out of 3.  So that was the limit of my ability at that time, and it was not necessary to argue myself upward or downward, because I could just test it.

One of the chief ways that smart people end up stupid, is by getting so used to winning that they stick to places where they know they can win—meaning that they never stretch their abilities, they never try anything difficult.

It is said that this is linked to defining yourself in terms of your "intelligence" rather than your "effort": winning easily is a sign of your "intelligence", whereas failing on a hard problem could at least have been interpreted as a good effort.

Now, I am not quite sure this is how an adept rationalist should think about these things: rationality is systematized winning and trying to try seems like a path to failure.  I would put it this way:  A hypothesis affords testing!  If you don't know whether you'll win on a hard problem—then challenge your rationality to discover your current level.  I don't usually hold with congratulating yourself on having tried—it seems like a bad mental habit to me—but surely not trying is even worse.  If you have cultivated a general habit of confronting challenges, and won on at least some of them, then you may, perhaps, think to yourself "I did keep up my habit of confronting challenges, and will do so next time as well".  You may also think to yourself "I have gained valuable information about my current level and where I need improvement", so long as you properly complete the thought, "I shall try not to gain this same valuable information again next time".

If you win every time, it means you aren't stretching yourself enough.  But you should seriously try to win every time.  And if you console yourself too much for failure, you lose your winning spirit and become a scrub.

When I try to imagine what a fictional master of the Competitive Conspiracy would say about this, it comes out something like:  "It's not okay to lose.  But the hurt of losing is not something so scary that you should flee the challenge for fear of it.  It's not so scary that you have to carefully avoid feeling it, or refuse to admit that you lost and lost hard.  Losing is supposed to hurt.  If it didn't hurt you wouldn't be a Competitor.  And there's no Competitor who never knows the pain of losing.  Now get out there and win."

Cultivate a habit of confronting challenges—not the ones that can kill you outright, perhaps, but perhaps ones that can potentially humiliate you.  I recently read of a certain theist that he had defeated Christopher Hitchens in a debate (severely so; this was said by atheists).  And so I wrote at once to the Bloggingheads folks and asked if they could arrange a debate.  This seemed like someone I wanted to test myself against.  Also, it was said by them that Christopher Hitchens should have watched the theist's earlier debates and been prepared, so I decided not to do that, because I think I should be able to handle damn near anything on the fly, and I desire to learn whether this thought is correct; and I am willing to risk public humiliation to find out.  Note that this is not self-handicapping in the classic sense—if the debate is indeed arranged (I haven't yet heard back), and I do not prepare, and I fail, then I do lose those stakes of myself that I have put up; I gain information about my limits; I have not given myself anything I consider an excuse for losing.

Of course this is only a way to think when you really are confronting a challenge just to test yourself, and not because you have to win at any cost.  In that case you make everything as easy for yourself as possible.  To do otherwise would be spectacular overconfidence, even if you're playing tic-tac-toe against a three-year-old.

A subtler form of underconfidence is losing your forward momentum—amid all the things you realize that humans are doing wrong, that you used to be doing wrong, of which you are probably still doing some wrong.  You become timid; you question yourself but don't answer the self-questions and move on; when you hypothesize your own inability you do not put that hypothesis to the test.

Perhaps without there ever being a watershed moment when you deliberately, self-visibly decide not to try at some particular test... you just.... slow..... down......

It doesn't seem worthwhile any more, to go on trying to fix one thing when there are a dozen other things that will still be wrong...

There's not enough hope of triumph to inspire you to try hard...

When you consider doing any new thing, a dozen questions about your ability at once leap into your mind, and it does not occur to you that you could answer the questions by testing yourself...

And having read so much wisdom of human flaws, it seems that the course of wisdom is ever doubting (never resolving doubts), ever the humility of refusal (never the humility of preparation), and just generally, that it is wise to say worse and worse things about human abilities, to pass into feel-good feel-bad cynicism.

And so my last piece of advice is another perspective from which to view the problem—by which to judge any potential habit of thought you might adopt—and that is to ask:

Does this way of thinking make me stronger, or weaker?  Really truly?

I have previously spoken of the danger of reasonableness—the reasonable-sounding argument that we should two-box on Newcomb's problem, the reasonable-sounding argument that we can't know anything due to the problem of induction, the reasonable-sounding argument that we will be better off on average if we always adopt the majority belief, and other such impediments to the Way.  "Does it win?" is one question you could ask to get an alternate perspective.  Another, slightly different perspective is to ask, "Does this way of thinking make me stronger, or weaker?"  Does constantly reminding yourself to doubt everything make you stronger, or weaker?  Does never resolving or decreasing those doubts make you stronger, or weaker?  Does undergoing a deliberate crisis of faith in the face of uncertainty make you stronger, or weaker?  Does answering every objection with a humble confession of your fallibility make you stronger, or weaker?

Are your current attempts to compensate for possible overconfidence making you stronger, or weaker?  Hint:  If you are taking more precautions, more scrupulously trying to test yourself, asking friends for advice, working your way up to big things incrementally, or still failing sometimes but less often than you used to, you are probably getting stronger.  If you are never failing, avoiding challenges, and feeling generally hopeless and dispirited, you are probably getting weaker.

I learned the first form of this rule at a very early age, when I was practicing for a certain math test, and found that my score was going down with each practice test I took, and noticed going over the answer sheet that I had been pencilling in the correct answers and erasing them.  So I said to myself, "All right, this time I'm going to use the Force and act on instinct", and my score shot up to above what it had been in the beginning, and on the real test it was higher still.  So that was how I learned that doubting yourself does not always make you stronger—especially if it interferes with your ability to be moved by good information, such as your math intuitions.  (But I did need the test to tell me this!)

Underconfidence is not a unique sin of rationalists alone.  But it is a particular danger into which the attempt to be rational can lead you.  And it is a stopping mistake—an error which prevents you from gaining that further experience which would correct the error.

Because underconfidence actually does seem quite common among aspiring rationalists whom I meet (though rather less common among rationalists who have become famous role models), I would indeed name it third among the three besetting sins of rationalists.


Part of the sequence The Craft and the Community

Next post: "Well-Kept Gardens Die By Pacifism"

Previous post: "My Way"

Comments (176)

Comment author: [deleted] 20 April 2009 10:05:41AM 25 points [-]

I wonder if the decline of apprenticeships has made overconfidence and underconfidence more common and more severe.

I'm not a history expert, but it seems to me that a blacksmith's apprentice 700 years ago wouldn't have had to worry about over/underconfidence in his skill. (Gender-neutral pronouns intentionally not used here!) He would have known exactly how skilled he was by comparing himself to his master every day, and his master's skill would have been a known quantity, since his master had been accepted by a guild of mutually recognized masters.

Nowadays, because of several factors, calibrating your judgement of your skill seems to be a lot harder. Our education system is completely different, and regardless of whatever else it does, it doesn't seem to be very good at providing reliable feedback to its students, or at getting them to properly understand the importance of that feedback and respond accordingly. Our blacksmith's apprentice (let's call him John) knows when he's screwed up - the sword or whatever that he's made breaks, or his master points out how it's flawed. And John knows why this is important - if he doesn't fix the problem, he's not going to be able to earn a living.

Whereas a modern schoolkid (let's call him Jaden) may be absolutely unprepared to deal with math, but he doesn't know exactly how many years he's behind (it's hard enough to get this information in aggregate, and it seems to be rarely provided to the students themselves on an individual basis - no one is told "you are 3 years behind where you ought to be"). And Jaden has absolutely no clue why that matters, since the link between math and his future employment isn't obvious to him, and no one's explaining it to him. (School isn't for learning; as Paul Graham has explained, "Officially the purpose of schools is to teach kids. In fact their primary purpose is to keep kids locked up in one place for a big chunk of the day so adults can get things done. And I have no problem with this: in a specialized industrial society, it would be a disaster to have kids running around loose.")

Another modern schoolkid (let's call her Jaina) may be really skilled at math, but testing won't indicate this strongly enough (it works both ways; tests saturate at the high end - especially if they're targeting a low level of achievement for the rest of the class - and "you are 3 years ahead of everyone else in this room" is not feedback that is commonly given). And there's a good chance it won't be obvious to her how important this is, and how important becoming even more skilled is. And if she ends up being underconfident in her ability, and the feedback loop ("I know how skilled I am, I know why becoming stronger is important, and I know what I need to do") isn't established, then instead of learning plasma physics and working on ITER or DEMO, she goes into marketing or something. Maybe doing worthy things, but not being as awesome as she could have been.

My point, after this wondering, is that I agree with this post, and want to elaborate: structuring what you do so that you test yourself in the process of doing it is a good way to establish a feedback loop that increases your skill and the accuracy of your confidence in it. I find nothing wrong with the debating example in this post, but I worry that it makes self-testing sound like something that you should go out and do, separate from your everyday work. (Part of this, I think, is due to Eliezer's very unusual occupation.) My usual self-testing example is something like "can I write this program correctly on the very first try?". That's a hard challenge, integrated into my everyday work. Successfully completing it, or coming close, has allowed me to build up my skill ("the compiler in my head") and avoid the danger of underconfidence.

Comment author: Will_Newsome 09 August 2010 05:41:04PM 12 points [-]

A friend of mine, normal in most ways, has exceptionally good mental imagery, such that one time she visited my house and saw a somewhat complex 3-piece metalwork puzzle in my living room and thought about it later that evening after she had left, and was able to solve it within moments of picking it up when she visited a second time. At first I was amazed at this, but I soon became more amazed that she didn't find this odd, and that no one had ever realized she had any particular affinity for this kind of thing in all the time she'd been in school. I'm curious as to how many cognitive skills like this there are to excel at and if many people are actually particularly good at one or many of them without realizing it due to a lack of good tests for various kinds of cognition.

Comment author: LongInTheTooth 20 April 2009 02:50:37PM 8 points [-]

Without risk, there is no growth.

If your practice isn't making you feel scared and uncomfortable, it's not helping. Imagine training for a running race without any workouts that raise your heart rate and make you breathe hard.

Feeling out of your comfort zone and at risk of failure is something everybody should seek out on a regular basis.

Comment author: Eliezer_Yudkowsky 20 April 2009 04:15:30PM 12 points [-]

My usual self-testing example is something like "can I write this program correctly on the very first try?". That's a hard challenge, integrated into my everyday work.

I should try to remember to try this the next time I have a short piece of code to write. Furthermore, it's the sort of thing that makes me slightly uncomfortable and is therefore easy to forget, so I should try harder to remember it.

In general, this sort of thing seems like a very useful technique if you can do it without endangering your work. Modded parent up.

Comment author: DanielLC 08 May 2013 06:02:08AM 0 points [-]

My usual self-testing example is something like "can I write this program correctly on the very first try?".

I never thought of that as a thing you could do. I think when my code compiles on the first try, it's more often than not a sign of something very wrong. For example, the last time it happened was because I forgot to add the file I was working on to the makefile.

Perhaps I should try to learn to code more precisely.

Comment author: [deleted] 08 May 2013 06:36:13AM 0 points [-]

Heh. (You should use makefiles that automatically build new files, and automatically sense dependencies for rebuild.)
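A minimal sketch of what such a makefile might look like, assuming a GCC-compatible compiler (the `-MMD -MP` flags and the `program` target name are illustrative, not anything from this thread):

```make
# Pick up any new .c file automatically via wildcard,
# and emit .d dependency files so header changes trigger rebuilds.
SRCS := $(wildcard *.c)
OBJS := $(SRCS:.c=.o)
DEPS := $(OBJS:.o=.d)

CFLAGS += -MMD -MP   # generate dependency info as a compilation side effect

program: $(OBJS)
	$(CC) $(OBJS) -o $@

# Include the generated dependency files if they exist.
-include $(DEPS)

clean:
	rm -f $(OBJS) $(DEPS) program
```

With this setup, forgetting to list a new source file (DanielLC's failure mode above) can't happen: `wildcard` finds it, and the auto-generated `.d` files keep header dependencies current.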

As I recall, Eliezer said somewhere (I'm too tired to Google for it) that there is no limit to the amount of intelligence that you can use while programming.

Comment author: ciphergoth 20 April 2009 07:09:36AM 20 points [-]

it was said by them that Christopher Hitchens should have watched the theist's earlier debates and been prepared, so I decided not to do that

I urge you to prepare properly. Not only Hitchens but Richard Carrier and several other atheists have been humiliated in debate with him, by their own admission. Winning at all is challenge enough, and would be a great service to the world. Given how much of a blow you would find it to lose having fully prepared, I urge you to reconsider whether you're self-handicapping.

Comment author: CronoDAS 20 April 2009 06:29:39PM *  24 points [-]

Scientists are frequently advised to never participate in a live debate with a creationist. This is because being right has absolutely nothing to do with winning.

"Debating creationists on the topic of evolution is rather like playing chess with a pigeon - it knocks the pieces over, craps on the board, and flies back to its flock to claim victory." -- Scott D. Weitzenhoffer

Debates are not a rationality competition. They're a Dark Arts competition, in which the goal is to use whatever underhanded trick you can come up with in order to convince somebody to side with you. Evidence doesn't matter, because it's trivial to simply lie your ass off and get away with it.

The only kind of debates worth having are written debates, in which, when someone tells a blatant lie, you can look up the truth somewhere and take all the space you need to explain why it's a lie - and "cite your sources, or you forfeit" is a reasonable rule.

Comment author: CannibalSmith 20 April 2009 10:42:47AM *  13 points [-]

Indeed. Association fallacy. Eliezer might not think much of his loss, but it would still be seen by people as a loss for "the atheists" and a victory for "the theists". Debate to win!

Comment author: Yvain 20 April 2009 07:22:58AM *  9 points [-]

Who is this theist? I'm interested in watching these debates. (though obviously without knowledge of the specific case, I agree with ciphergoth. It's not just about you, it's about whoever's watching.)

Comment author: gjm 20 April 2009 09:30:26AM *  6 points [-]

I agree with ciphergoth's guess.

Eliezer: I agree with ciphergoth and Yvain. Debating, at least as far as the Theist Who (Apparently) Must Not Be Named is concerned, is a performance art more than it is a form of intellectual inquiry, and unless you've done a lot of it you run the severe risk of getting eaten by someone who has, especially if you decide to handicap yourself. If you engage in such a debate, the chances are that at least some people will watch or hear it, or merely learn of the result, and change their opinions as a result. (Probably few will change so far as to convert or deconvert: maybe none. Many will find that their views become more or less entrenched.)

What would you think of a musician who decided to give a public performance without so much as looking at the piece she was going to play? Would you not be inclined to say: "It's all very well to test yourself, but please do it in private"?

(For what it's worth, I think it's rather unlikely that TTWMNBN will agree to a Bloggingheads-style debate. He would want it to be public. And he might decide that Eliezer isn't high-enough-profile to be worth debating. Remember: for him, among other things, this is propaganda.)

[EDITED a few minutes after posting to remove the explicit mention of the theist's name]

Comment author: ciphergoth 20 April 2009 10:49:26AM 1 point [-]

For what it's worth, I think it's rather unlikely that TTWMNBN will agree to a Bloggingheads-style debate. He would want it to be public. And he might decide that Eliezer isn't high-enough-profile to be worth debating. Remember: for him, among other things, this is propaganda.

Entirely agreed. There's a chance such a debate could be arranged if the book is a success, though.

Comment author: JulianMorrison 20 April 2009 09:33:53AM *  -1 points [-]

Rot13 is your friend. (Edit: fixed above)

Comment author: gjm 20 April 2009 09:38:54AM 1 point [-]

I already knew that, as you might have inferred from "I agree with ciphergoth's guess" and, er, the fact that I named him in my last paragraph. (That was an oversight which I am about to correct.) Perhaps I should have been more explicit about what guess I was agreeing with.

I don't know why the coyness, but perhaps TTWMNBN is suspected of googling for his own name every now and then. Or perhaps ciphergoth was just (semi-)respecting Eliezer's decision not to name him.

Comment author: ciphergoth 20 April 2009 10:49:50AM 0 points [-]

Semi-respecting.

Comment author: AllanCrossman 20 April 2009 02:19:38PM 8 points [-]

But you haven't really not named him. Anyone can decipher these posts with a small amount of effort. All that's happened is that this thread has become annoying to read.

Comment author: ciphergoth 20 April 2009 07:40:56AM 3 points [-]

Jvyyvnz Ynar Penvt (I'm guessing; certainly the only time I've heard it credibly said that Hitchens lost a debate with a theist)

Comment author: JulianMorrison 20 April 2009 09:27:48AM -1 points [-]

Well, I already know the proper counter to his pet argument. Hat tip, Tnel Qerfpure for explaining gvzr.

Comment author: ciphergoth 20 April 2009 09:39:14AM *  3 points [-]

He has piles of pet arguments, that's part of his technique; he fires off so many arguments that you can't answer them all. I've watched him and put a lot of thought into how I'd answer him, and I'm still not sure how I can fit the problems with his arguments into the time available in a debate, but I'd start with asking either him or the audience to pick which of his arguments I was going to counter in my reply.

In particular, I still don't have a counter to the fine-tuning argument which is short, assumes no foreknowledge, and is entirely intellectually honest.

Could you point me to the counter argument you rot-13? Google isn't finding it for me. Thanks!

Comment author: drnickbone 16 June 2012 11:41:20AM 11 points [-]

In particular, I still don't have a counter to the fine-tuning argument which is short, assumes no foreknowledge, and is entirely intellectually honest.

The "fine-tuning" argument falls into the script:

  1. Here is a puzzle that scientists can't currently explain
  2. God explains it
  3. Therefore God exists

If you accept that script you lose the debate, because there will always be some odd fact that can't currently be explained. (And even if it can actually be explained, you won't have time to explain it within the limits of the debate and the audience's knowledge.)

The trap is that it is a very tempting mistake to try to solve the puzzle yourself. It's highly unlikely that you will succeed, and your opponent will already know the flaws in (and counter-arguments to) most of the existing solution attempts, so can throw those at you. Or if you support a fringe theory (which isn't generally considered in the solution space, but might work), the opponent can portray you as a marginal loon.

I suspect that the theist wins these debates because most opponents fall into that trap. They are smart enough that they think that they can resolve the puzzle in question, and so walk right into it. By debating domain experts, the theist positively invites them into the trap.

How I might respond. "I can't currently explain the values of physical constants, and as far as I'm aware no-one else can either. If you think you have an explanation, you can do the scientific community a great service. Just formulate your 'God' theory as a set of equations from which we can derive those values, including some values or degrees of precision that we don't currently know. Propose experiments by which we can test that theory. Submit to a leading physics journal, and get physicists to perform the experiments. When you've done that, you can claim evidence for your theory, and I will be more inclined to support it. You can't do it though, can you?"

Comment author: Psychohistorian 20 April 2009 03:34:53PM 6 points [-]

The anthropic principle does technically work, but it admittedly feels like a cheat and I'd expect most audiences not familiar with it already would consider it such.

It's not a knock-down counterargument, but it seems to me we don't know enough about physics to say it's actually possible that the universe could be fine-tuned differently. Sure, we can look at a lot of fundamental constants and say, "If that one were different by 1 unit, fusion wouldn't occur," but we don't know if they are interconnected, and I don't think we can accurately model what would occur. So it's possible that it couldn't be different, that other constants would vary with it, and/or that it would make a universe so entirely different from our own that we have no idea what it would be like - and thus quite possible that it could support life of some form.

Or, reduced into something more succinct, we don't actually know what the universe would look like if we changed fundamental constants (if this is even possible) because the results are beyond our ability to model, so it's quite possible that most possible configurations would support some form of life.

Multiverse works too, but again feels like cheating. I also admit there may be facts that undermine this, I'm not super-familiar with the necessary physics.

Comment author: Furcas 20 April 2009 06:45:05PM *  3 points [-]

If there is no multiverse, "Why is the universe the way it is rather than any other way?" is a perfectly good question to which we haven't found the answer yet. However, theists don't merely ask that question, they use our ignorance as an argument for the existence of a deity. They think a creator is the best explanation for fine-tuning. The obvious counter-argument is that not only is a creator not the best explanation, it's not an explanation at all. We can ask the exact same question about the creator that we asked about the universe: Why is the creator what it is rather than something else? Why isn't 'He' something that couldn't be called a 'creator' at all, like a quark, or a squirrel? Or, to put the whole thing in the right perspective, why is the greater universe formed by the combination of our universe and its creator the way it is, rather than any other way?

At this point the theist usually says that God is necessary, or outside of time, which could just as easily be true of the universe as we know it. Or the theist might say that God is eternal, while our universe probably isn't, which is irrelevant. None of these alleged characteristics of God's explain why He's fine-tuned, anyway.

Comment author: billswift 20 April 2009 01:25:10PM 1 point [-]

I haven't read this particular version of the fine-tuning argument, but the general counter-argument is that evolution fine-tuned life (humans) for the universe, not that the universe was fine-tuned for humans.

Comment author: ciphergoth 20 April 2009 02:06:26PM *  5 points [-]

Unfortunately, that doesn't work. Without the fine tuning, the Universe consists of undifferentiated mush, and evolution is impossible.

Comment author: billswift 20 April 2009 03:33:25PM 1 point [-]

That isn't any version of the fine tuning argument I've heard. And it just sounds plain stupid. Who makes this particular argument, and more importantly how do they justify it? It sounds like some wild claim that is just too irrational to refute.

Comment author: timtyler 20 April 2009 06:11:33PM 2 points [-]

To me it sounds commonplace. What is the problem you see?

Comment author: AllanCrossman 20 April 2009 02:07:29PM 2 points [-]

evolution fine-tuned life (humans) for the universe, not that the universe was fine-tuned for humans.

I don't think this is good enough. There seem to be several physical constants that - if they had been slightly different - would have made any sort of life unlikely.

Comment author: Alicorn 20 April 2009 02:33:27PM *  2 points [-]

That part can be deproblematized (if you will forgive the nonce word) by the anthropic principle: if the universe were unsuited for life, there would be no life to notice that and remark upon it.

Comment author: AllanCrossman 20 April 2009 02:45:32PM *  3 points [-]

if the universe were unsuited for life, there would be no life to notice that and remark upon it.

True. But since a universe unsuitable for life seems overwhelmingly the more probable situation, we can still ask why it isn't so.

(My own feeling is that the problem has to be resolved by either "God" or "a multiverse". The idea that there's precisely one universe and it just happens to have the conditions for life seems extraordinary.)

Comment author: Psy-Kosh 20 April 2009 08:56:28PM 3 points [-]

My understanding (I'd have to dig out references) is that the fine tuning may not be as fine as generally believed. Ah, the wikipedia page on the argument has some references on this: http://en.wikipedia.org/wiki/Fine-tuned_Universe#Disputes_on_the_existence_of_fine-tuning

In addition to the anthropic type arguments, some theoretical work seems to suggest that the fine tuning isn't. ie, that we don't even need to invoke anthropic reasoning too strongly. Heck, supposedly one can even have stars in a universe with no weak interaction at all.

So it may very well be that, even without appealing to anthropic style reasoning in multiverses (which I'm not actually opposed to, but there's stuff there that I still don't understand. Born stats, apparent breakdown of the Aumann Agreement Theorem, etc... so too easy to get stuff wrong) anyways, even without that, it may well be that the fine tuning stuff can be refuted by simply pointing out "looking at the actual physics, the tuning is rather less fine than claimed."

Comment author: AlexU 20 April 2009 02:40:00PM 3 points [-]

I agree, but the anthropic principle has always seemed like a bit of cheat -- an explanation that really isn't much of an explanation at all.

Comment author: DanielLC 08 May 2013 06:10:01AM 2 points [-]

I don't accept that form of the anthropic principle. I am on a planet, even though planets make up only a tiny portion of the universe, because there's (almost) nobody not on a planet to remark on it. The anthropic principle says that you will be where a person is. However, it can't change the universe. The laws of physics aren't going to rewrite themselves just because there was nobody there to see them.

That being said, if you combine this with multiple universes, it works. The multiverse is obviously suitable for life somewhere. We are going to end up in one of those places.

Comment author: Luke_A_Somers 08 May 2013 02:48:38PM 1 point [-]

Even in the case of a single infinite universe, the anthropic principle does help - it means that any arbitrarily low success rate for forming life is equally acceptable, so long as it is not identically zero.

Comment author: byrnema 21 April 2009 12:37:39AM *  1 point [-]

Exactly. The parameters we have define this universe. Any complex system -- presumably most if not all universes -- would have complex patterns. You would just need patterns to evolve that are self-promoting (i.e., accumulative) and evolving, and eventually a sub-pattern will evolve that significantly refers to itself. Given that replicating forms can result from simple automaton rules, and self-reference appears randomly (in a formal sense) all over the place in a random string (Gödel), it doesn't seem so improbable for such a pattern to emerge. In fact, an interesting question is why there is only one "life" that we know of (i.e., carbon-based). Once we understand the mechanism of consciousness, we may find that it is duplicated elsewhere -- perhaps not in patterns that are accumulative and evolving, but briefly, spontaneously. This is totally idle speculation, of course.

Another argument: there's nothing in Physics that says there isn't a mechanism for how the parameters are chosen. It's just another mystery that hasn't been solved yet -- so far, God has reliably delegated answers to questions about the empirical world to Science.

Comment author: gjm 20 April 2009 09:41:18AM 0 points [-]

I'd start with asking either him or the audience to pick which of his arguments I was going to counter in my reply.

Yes, that's something I've often thought too. (Not only about this particular theist; the practice of throwing up more not-very-good arguments than can be refuted in the time available seems to be commonplace in debates about religious topics. Quite possibly in all debates, but I haven't watched a broad enough sample to know.)

Comment author: NQbass7 20 April 2009 01:50:11PM 0 points [-]

Puevf Unyydhvfg wrote about how he would debate Jvyyvnz Ynar Penvt on his blog. I found it worthwhile.

Comment author: JulianMorrison 20 April 2009 07:48:53AM 1 point [-]

No, winning is good but losing is also useful - we ought to permanently eliminate from the corpus any argument that fails. Even if it wouldn't fail against a blockhead without the intellectual muscle to finesse a counter.

Comment author: ciphergoth 20 April 2009 09:54:39AM 2 points [-]

Losing is a lot more informative if we build on what we learned last time, don't you think?

Comment author: PhilGoetz 20 April 2009 10:31:54PM *  1 point [-]

If you permanently eliminate from the gene pool any genes that aren't currently working efficiently, your ability to evolve is limited.

Comment deleted 20 April 2009 09:50:31AM [-]
Comment author: ciphergoth 20 April 2009 09:57:35AM *  3 points [-]

I disagree; I watched Eliezer vs Adam Frank, and at several points I paused it, trying to work out what I'd say in response to Frank's arguments. I still found that Eliezer got across the counterarguments in a far neater way when I unpaused, and he had a lot less time than I did.

(BTW, after hearing that I also learned how his name is pronounced, so I'm better at spelling it correctly: it's Eli-Ezer, four syllables.)

Comment author: Eliezer_Yudkowsky 20 April 2009 04:29:45PM 0 points [-]

I'd read Frank's book. (And I did try to direct him to the webpages whereby he could have read my stuff.) But I think I could've done it equally well on the fly.

Comment author: Eric 25 April 2009 12:08:05AM 10 points [-]

I've found some of the characterizations of Craig's arguments and debate style baffling.

When he debates the existence of god, he always delivers the same five arguments (technically, it's four: his fifth claim is that god can be known directly, independently of any argument). He develops these arguments as carefully as time allows, and defends each of his premises. He uses the kalam cosmological argument, the fine tuning argument, the moral argument, and the argument from the resurrection of Jesus. This can hardly be characterized as dumping.

Also, his arguments are logically valid; you won't see any 'brain teaser, therefore god!' moves from him. He's not only a 'theologian'; he's a trained philosopher (he actually has two earned PhDs, one in philosophy and one in theology).

Finally, Craig is at his best when it comes to his responses. He is extremely quick, and is very adept at both responding to criticisms of his arguments, and at taking his opponent's arguments apart.

Debating William Lane Craig on the topic of god's existence without preparation would be as ill advised as taking on a well trained UFC fighter in the octagon without preparation. To extend the analogy further, it would be like thinking it's a good idea because you've won a couple of street fights and want to test yourself.

Comment author: Jack 25 April 2009 12:27:55AM 1 point [-]

I don't think it's a good idea either. But the fact that the debate would be on bloggingheads rather than in front of an audience with formal speeches and timed rebuttals definitely helps Eliezer. He's free to ask questions, clarify things, etc.

So really it's like fighting a UFC fighter in an alley. Not a good idea, but I guess you might have a chance.

Comment author: Eliezer_Yudkowsky 25 April 2009 12:47:55AM 4 points [-]

I'd tend to assume that the absence of a moderator makes it easier to abuse the more inexperienced party.

Comment author: Jack 25 April 2009 01:04:02AM 3 points [-]

Well, you'd have more experience with the medium. But at a formal debate he'd give five arguments, each of which would take your entire speaking time to respond to. On bloggingheads you can ask for his best argument and then spend as much time as you need on it (or within bloggingheads limits, I guess). Also, if you watch formal debates between theists and atheists, the participants often avoid answering the difficult questions. In particular, theists always avoid explaining how invoking God doesn't merely obscure and push the question of creation back a step. This medium gives you an opportunity to press things, and I like to think that opportunity is an advantage for the side of truth.

Still, I'm sure he has an answer to that question. The guy does this for a living; I think even if you prepare it would be a good test of your skills.

Comment author: robirahman 17 March 2016 04:23:53AM *  0 points [-]

Did this debate ever end up happening? If it did, is there a transcript available somewhere?

Edit: Found in another comment that WLC turned down the debate.

Comment author: pozorvlak 22 December 2009 10:52:50AM 9 points [-]

Cultivate a habit of confronting challenges - not the ones that can kill you outright, perhaps, but perhaps ones that can potentially humiliate you.

You may be interested to learn that high-end mountaineers apply exactly the strategy you describe to challenges that might kill them outright. Mick Fowler even states it explicitly in his autobiography - "success every time implies that one's objectives are not challenging enough".

A large part of mountaineering appears to be about identifying the precise point where your situation will become unrecoverable, and then backing off just before you reach it. On the other hand, sometimes you just get unlucky.

Comment author: komponisto 20 April 2009 06:46:56PM *  7 points [-]

A slogan I like is that failure is OK, so long as you don't stop trying to avoid it.

While reading this post, a connection with Beware of Other-Optimizing clicked in my mind. Different aspiring rationalists are (more) susceptible to different failure modes. From Eliezer's previous writings it had generally seemed like he was more worried about the problem of standards (for oneself) that are too low -- that is, not being afraid enough of failure -- than about the opposite error, standards that are too high. But I suspect that's largely specific to him; others may need to worry more about being too afraid of failure. Hence I'm happy to see this post.

Comment author: ciphergoth 20 April 2009 10:59:16AM *  18 points [-]

Unfair debate proposal

You want a debate in which the tables are tilted against you? I see a way to do that which doesn't carry the risks of your current proposal.

A bunch of us get together on an IRC channel and agree to debate you. We thrash out our initial serve; we then spring the topic and our initial serve on you. You must counter immediately, with no time to prepare. We then go away and mull over your counter, and agree a response, which you must again immediately respond to.

We can give ourselves more speaking time than you in each exchange, too, if you want to tilt the tables further (I'm imagining the actual serves and responses being delivered as video).

Comment author: byrnema 20 April 2009 05:41:17PM 7 points [-]

Since Eliezer hasn't prepared by watching earlier debates, one solution could be to just use arguments from the theist's past debates in a simulated debate. As Eliezer prefers, he wouldn't prepare and would have to answer questions immediately.

There are two drawbacks: first, it would just be "us" evaluating whether Eliezer performed well (but then, debate performance is always somewhat subjective); second, we would lose the interaction of question, response and follow-up question.

Nevertheless, Eliezer's off-the-cuff responses to the theist's past questions could be informative.

Comment author: Eliezer_Yudkowsky 20 April 2009 04:10:20PM 2 points [-]

You're not theists; a handicap is more appropriate if we're going to be debating theology with you taking the positive... but this does sound interesting, so long as we can find a debate position that I agree with but that others are willing to take the negative of.

Comment author: SoullessAutomaton 20 April 2009 09:10:36PM 4 points [-]

I'm pretty sure it's not required that one agree with a position to debate in its favor.

Comment author: dclayh 21 April 2009 11:16:50PM *  0 points [-]

In fact, I have a post kicking around on the subject that it's easier in a debate to defend the side you don't agree with. But perhaps Eliezer also believes this and is looking to further handicap himself :)

Comment author: Vladimir_Nesov 20 April 2009 05:19:50PM *  0 points [-]

This triggered an idea about paranoid debating: require players to submit a preliminary answer in the first few seconds of being presented with the question, then debate.

Comment author: ata 25 April 2011 05:55:42PM 6 points [-]

There are three great besetting sins of rationalists in particular, and the third of these is underconfidence.

Were we ever told the other two?

Comment author: steven0461 25 April 2011 06:17:18PM *  13 points [-]

Yes, by Jeffreyssai:

"Three flaws above all are common among the beisutsukai. The first flaw is to look just the slightest bit harder for flaws in arguments whose conclusions you would rather not accept. If you cannot contain this aspect of yourself then every flaw you know how to detect will make you that much stupider. This is the challenge which determines whether you possess the art or its opposite: Intelligence, to be useful, must be used for something other than defeating itself."

"The second flaw is cleverness. To invent great complicated plans and great complicated theories and great complicated arguments - or even, perhaps, plans and theories and arguments which are commended too much by their elegance and too little by their realism. There is a widespread saying which runs: 'The vulnerability of the beisutsukai is well-known; they are prone to be too clever.' Your enemies will know this saying, if they know you for a beisutsukai, so you had best remember it also. And you may think to yourself: 'But if I could never try anything clever or elegant, would my life even be worth living?' This is why cleverness is still our chief vulnerability even after its being well-known, like offering a Competitor a challenge that seems fair, or tempting a Bard with drama."

"The third flaw is underconfidence, modesty, humility. You have learned so much of flaws, some of them impossible to fix, that you may think that the rule of wisdom is to confess your own inability. You may question yourself so much, without resolution or testing, that you lose your will to carry on in the Art. You may refuse to decide, pending further evidence, when a decision is necessary; you may take advice you should not take. Jaded cynicism and sage despair are less fashionable than once they were, but you may still be tempted by them. Or you may simply - lose momentum."

Comment author: Vladimir_Nesov 24 April 2009 10:19:40AM *  5 points [-]

I skimmed several debates with WLC yesterday, referenced here. His arguments are largely based on one and the same scheme:

  1. Everything must have a cause
  2. Here's a philosophical paradox for you, that can't be resolved within the world
  3. Since, despite the paradox, some fact still holds, it must be caused by God, from outside the world

(Or something like this; step 3 is a bit more subtle than I made it out to be.) What's remarkable is that, even though he uses a nontrivial number of paradoxes for step 2, almost all of them were explicitly explained in the material on Overcoming Bias. At least, I was never confused while listening to his arguments, whereas some of his opponents were, on some of the arguments. I don't see WLC as possessing magical oratorical skills, but he bends the facts on occasion, and is very careful in what he says. Also, his presentations are too debugged to be alive, so they look unnatural.

The general meta-counterargument would be to break this scheme, since he could present some paradox (e.g. anthropics) without a clear known resolution, and push his line through it. I'm sure he knows lots of paradoxes, so there is a real danger of encountering an unknown one.

He knows Bayesian math. On one occasion, he basically replied to a statement that there is no evidence for God by saying that this is only relevant if you expect more evidence for God if it exists than if it doesn't; if you expect no evidence in both cases, the absence of evidence can't lower the a priori probability. This, of course, contradicts the rest of his arguments, but I guess he'd say that those arguments are some different kind of evidence.
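The Bayesian point here can be made concrete: an observation only shifts belief in a hypothesis through the likelihood ratio, so if the observation is equally probable whether or not the hypothesis is true, the posterior equals the prior. A minimal sketch (all numbers hypothetical):

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H) P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Observation equally likely either way: no update at all.
print(posterior(0.2, 0.5, 0.5))   # 0.2 -- unchanged
# Observation likelier under H: belief rises above the prior.
print(posterior(0.2, 0.8, 0.4))   # roughly 0.333
```

This is just the standard odds-ratio form of Bayes' rule written out; the specific numbers carry no significance.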

Comment author: ciphergoth 24 April 2009 10:56:42AM 12 points [-]

Many of WLC's arguments have this rough structure:

  • Here's a philosophical brain teaser. Doesn't it make your head spin?
  • Look, with God we can shove the problem under the carpet
  • Therefore, God.

That's why I think that in order to debate him you have to explicitly challenge the idea that God could ever be a good answer to anything; otherwise, you disappear down the rabbit hole of trying to straighten out the philosophical confusions of your audience.

Comment author: MBlume 25 April 2009 02:26:31AM 1 point [-]

"saying 'God' is an epistemic placebo -- it gives you the feeling of a solution without actually solving anything"

something like that?

Comment author: ciphergoth 25 April 2009 10:46:12AM 2 points [-]

Well, you could start with something like that, but you're going to have to set out why it doesn't solve anything. Which I think means you're going to have to make the "lady down the street is a witch; she did it" argument. Making that simple enough to fit into a debate slot is a real challenge, but it is the universal rebuke to everything WLC argues.

Comment author: pnrjulius 12 June 2012 03:32:46AM 2 points [-]

I like to put it this way: Religion is junk food. It sates the hunger of curiosity without providing the sustenance of knowledge.

Comment author: Jack 24 April 2009 01:29:27PM 1 point [-]

If we shouldn't expect evidence in either case then the probability of God's existence is just the prior, right? How could P(God) be above .5? I can't imagine thinking that the existence of an omnipotent, omniscient and benevolent being who answers prayers and rewards and punishes the sins of mortals with everlasting joy or eternal punishment was a priori more likely than not.

I wonder what variety of first cause argument he's making. Even if everything must have a cause that does not mean there is a first cause and the existence of a first cause doesn't mean the first cause is God. Aquinas made two arguments of this variety that actually try to prove the existence of God, but they require outdated categories and concepts to even make.

Comment author: byrnema 24 April 2009 03:27:50PM *  1 point [-]

If God's existence is the prior, I don't think you include that he is also an "omnipotent, omniscient and benevolent being, [...]". Those are things you deduce about him afterward. The way I've thought about it is: let X = whatever the explanation is to the creation conundrum. We will call X "God". X exists trivially (by definition); can we then infer properties about X that would justify calling it God? In other words, does the solution to creation have to be something omniscient and benevolent? (This is the part which is highly unlikely.)

Comment author: pnrjulius 12 June 2012 03:34:48AM 1 point [-]

If you call X "God" by definition, you may find yourself praying to the Big Bang, or to mathematics.

There is a mysterious force inherent in all matter and energy which binds the universe together. We call it "gravity", and it obeys differential equations.

Comment author: byrnema 17 June 2012 05:56:36AM 0 points [-]

If you call X "God" by definition, you may find yourself praying to the Big Bang, or to mathematics.

The Big Bang and mathematics are good candidates. I've considered them. It only sounds ridiculous because you mentioned praying to them. The value of 'praying to X' is again something you need to deduce, rather than assume.

We call it "gravity", and it obeys differential equations.

Nah, gravity isn't universal or fundamental enough. That is, I would be very surprised if it was a 'first cause' in any way.

Comment author: Eliezer_Yudkowsky 24 April 2009 06:48:04PM 1 point [-]

You certainly should not call X "God", nor should you suppose that X has the property "existence" which is exactly that which is to be rendered non-confusing.

Comment author: byrnema 25 April 2009 02:19:26AM 1 point [-]

I just read your posts about the futility of arguing "by definition". I suspect that somewhere there is where my error lies.

More precisely, could you clarify whether I "shouldn't" do those things because they are "not allowed" or because they wouldn't be effective?

Comment author: MBlume 25 April 2009 02:30:22AM *  9 points [-]

You shouldn't because even though when you speak the word "God" you simply intend "placeholder for whatever eventually solves the creation conundrum," it will be heard as meaning "that being to which I was taught to pray when I was a child" -- whether you like it or not, your listener will attach the fully-formed God-concept to your use of the word.

Comment author: byrnema 25 April 2009 02:41:26AM *  2 points [-]

Got it. If X is the placeholder for whatever eventually solves the creation conundrum, there's no reason to call it anything else, much less something misleading.

Comment author: MBlume 25 April 2009 04:21:27AM 0 points [-]

precisely =)

Comment author: JulianMorrison 25 April 2009 02:45:35PM 0 points [-]

In fact even naming it X is a bit of a stretch, because "the creation conundrum" is being assumed here, but my own limited understanding of physics suggests this "conundrum" itself is a mistake. What a "cause" really is, is something like: the information about past states of the universe embedded in the form of the present state. But the initial state doesn't have embedded information, so it doesn't really have either a past or a cause. As far as prime movers go, the big bang seems to be it, sufficient in itself.

Comment author: byrnema 25 April 2009 03:11:10PM *  0 points [-]

Yes, I agree with you: there is no real conundrum. In the past, we've solved many "conundrums" (for example, Zeno's paradox and the Liar's Paradox). By induction, I believe that any conundrum is just a problem (often a math problem) that hasn't been solved yet.

While I would say that the solution to Zeno's paradox "exists", I think this is just a semantic mistake I made; a solution exists in a different way than a theist argues that God exists. (This is just something I need to work on.)

Regarding the physics: I understand how a state may not causally depend upon the one preceding it (for example, if the state is randomly generated). I don't understand (can't wrap my head around) whether that means it wasn't caused... it still was generated, by some mechanism.

Comment author: Vladimir_Nesov 25 April 2009 11:51:11AM 1 point [-]

More precisely, could you clarify whether I "shouldn't" do those things because they are "not allowed" or because they wouldn't be effective?

You shouldn't do something not directly because it's not allowed, but for the reason it's not allowed.

Comment author: byrnema 25 April 2009 02:00:02PM *  -2 points [-]

This comment is condescending and specious.

Comment author: Vladimir_Nesov 25 April 2009 02:18:34PM *  0 points [-]

That comment was meta. It isn't condescending, as it's not about you.

Comment author: byrnema 25 April 2009 02:41:18PM 0 points [-]

It's about me because you imply that I don't already know what you're saying, and I could benefit from this wise advice.

Comment author: Vladimir_Nesov 25 April 2009 02:48:16PM *  0 points [-]

If someone speaks the obvious, then it's just noise, no new information, and so the speaker should be castigated for destructive stupidity. Someone or I.

Comment author: Jack 24 April 2009 05:26:45PM 0 points [-]

You could do it that way, but then the question is just the prior probability that X has those traits. You can't say, "It would be a lot easier for God to do all of the things we think he needs to do if he was omnipotent, therefore it is more likely that God is omnipotent." Adding properties to God that increase His complexity has to decrease the probability that He exists; otherwise we're always going to be ascribing superpowers to the entities we posit, since superpowers never make it harder for those entities to accomplish the tasks we need them to. Now I suppose if you could deduce that God has those traits, then you would be providing evidence that X had those traits with a probability of 1. That's pretty remarkable, but anyone is free to have at it.

So either you're putting a huge burden on your evidence to prove that there is some X such that X has these traits OR you have to start out with an extremely low prior.
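The point about added properties is the conjunction rule: each ascribed trait is an extra conjunct, and a conjunction can never be more probable than any one of its parts. A toy calculation (every prior value below is made up purely for illustration, with the traits treated as independent):

```python
# Hypothetical, independent priors for each ascribed property:
p_exists     = 0.1   # some creator-entity X exists at all
p_omnipotent = 0.1   # ...and X is omnipotent
p_omniscient = 0.1   # ...and X is omniscient
p_benevolent = 0.1   # ...and X is benevolent

# Conjunction rule: P(A and B) <= min(P(A), P(B)), so every added
# property can only shrink the prior of the full package.
p_full_package = p_exists * p_omnipotent * p_omniscient * p_benevolent
print(p_full_package)   # far below p_exists alone
```

Under independence the prior drops geometrically with each property; even with correlated traits, it can never rise above the prior for bare existence.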

Comment author: handoflixue 23 April 2011 12:19:22AM 0 points [-]

For some reason, the idea that P(God) = 0.5 exactly amuses me. Thank you for the smile :)

Comment author: LukeStebbing 23 April 2011 01:10:53AM *  3 points [-]

It reminded me of one of my formative childhood books:

What is the probability there is some form of life on Titan? We apply the principle of indifference and answer 1/2. What is the probability of no simple plant life on Titan? Again, we answer 1/2. Of no one-celled animal life? Again, 1/2.

--Martin Gardner, Aha! Gotcha

He goes on to demonstrate the obvious contradiction, and points out some related fallacies. The whole book is great, as is its companion Aha! Insight. (They're bundled into a book called Aha! now.)
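The contradiction Gardner demonstrates can be checked in a line of arithmetic: "no life at all" entails both "no plant life" and "no one-celled animal life", so if indifference assigns each event probability 1/2 and the absences are even roughly independent, the assignments cannot cohere. A sketch (independence is assumed only to make the bound concrete):

```python
# Indifference assigns probability 1/2 to each of: some life on Titan,
# some simple plant life, some one-celled animal life.
p_any_life  = 0.5
p_no_plant  = 0.5
p_no_animal = 0.5

# "No life at all" entails no plant life AND no animal life; treating
# the absences as independent, that conjunction is about:
p_no_life_bound = p_no_plant * p_no_animal   # 0.25

# But indifference already claimed P(no life) = 1 - 1/2 = 1/2.
p_no_life_claimed = 1 - p_any_life
print(p_no_life_claimed, p_no_life_bound)    # 0.5 vs 0.25: incoherent
```

Applying indifference separately to nested events over-assigns probability; that is the fallacy the book walks through.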

Comment author: Vladimir_Nesov 24 April 2009 04:33:31PM 0 points [-]

If we shouldn't expect evidence in either case then the probability of God's existence is just the prior, right? How could P(God) be above .5? I can't imagine thinking that the existence of an omnipotent, omniscient and benevolent being who answers prayers and rewards and punishes the sins of mortals with everlasting joy or eternal punishment was a priori more likely than not.

Contradiction: answered prayers is lots of evidence.

Comment author: Jack 24 April 2009 05:16:18PM 0 points [-]

I'm looking at the concept of God and trying to guess what the prior would be for a being that meets that description. That description usually includes answering prayers. If there is evidence of answered prayers then we might want to raise the probability of God's existence -- but a being capable of doing that is going to be so complex that extraordinary evidence is necessary to conclude that one exists.

Comment author: William 24 April 2009 04:40:19PM 0 points [-]

Only if you have some sort of information about the unanswered prayers.

Comment author: pangloss 24 April 2009 02:20:46PM 0 points [-]

Given the problems for the principle of indifference, a lot of bayesians favor something more "subjective" with respect to the rules governing appropriate priors (especially in light of Aumann-style agreement theorems).

I'm not endorsing this maneuver, merely mentioning it.

Comment author: AndySimpson 20 April 2009 11:13:56PM 4 points [-]

gjm asks wisely:

What would you think of a musician who decided to give a public performance without so much as looking at the piece she was going to play? Would you not be inclined to say: "It's all very well to test yourself, but please do it in private"?

The central thrust of Eliezer's post is a true and important elaboration of his concept of improper humility, but doesn't it overlook a clear and simple political reality? There are reputational effects to public failure. It seems clear that those reputational effects often outweigh whatever utility is gained from an empirical "test" of one's own abilities: this is why international relations theory isn't a rigorous empirical science. We live in an irrational kaleidoscope of power, driven by instinct and emotion, ordered only fleetingly by rhetoric and guile. In this situation, we need to keep our cards close to our chest if we want to win.

Mulciber adds something along the same lines:

By increasing the challenge the way you suggest, you may very well be acting rationally toward the goal of testing yourself, but you're not doing all you can to cut the opponent. To rationally pursue winning the debate, there's no excuse for not doing your research.

And Eliezer does seem to approve of this mode of thinking in some cases:

Of course this is only a way to think when you really are confronting a challenge just to test yourself, and not because you have to win at any cost. In that case you make everything as easy for yourself as possible. To do otherwise would be spectacular overconfidence, even if you're playing tic-tac-toe against a three-year-old.

So, to sum up my concern, how is this principle of pragmatism reconciled to your choice not to prepare? Isn't it best to test yourself in the peace and safety of your dojo, or in circumstances where the stakes are not high, and use every means available to resist on the actual field of battle?

Comment author: noahlt 20 April 2009 07:30:29AM 4 points [-]

What is the danger of overconfidence?

Passing up opportunities. Not doing things you could have done, because you didn't try (hard enough).

Did you mean "danger of underconfidence"?

Comment author: Eliezer_Yudkowsky 20 April 2009 04:17:08PM 2 points [-]

Yes. Fixed. Thanks.

Apparently "danger of overconfidence" is cached in my mind to the point that even when the whole point of the article is the opposite, it still comes out that way. Case in point!

Comment author: Mulciber 20 April 2009 07:54:09PM 12 points [-]

It sounds as though you're viewing the debate as a chance to test your own abilities at improvisational performance. That's the wrong goal. Your goal should be to win.

"The primary thing when you take a sword in your hands is your intention to cut the enemy, whatever the means. Whenever you parry, hit, spring, strike or touch the enemy’s cutting sword, you must cut the enemy in the same movement. It is essential to attain this. If you think only of hitting, springing, striking or touching the enemy, you will not be able actually to cut him. More than anything, you must be thinking of carrying your movement through to cutting him."

By increasing the challenge the way you suggest, you may very well be acting rationally toward the goal of testing yourself, but you're not doing all you can to cut the opponent. To rationally pursue winning the debate, there's no excuse for not doing your research.

In choosing not to try for that, you'll end up sending the message that rationalists don't play to win. You and I know this isn't quite accurate -- what you're doing is more like a rationalist choosing to lose a board game, because that served some other, real purpose of his -- but that is still how it will come across. Do you consider this to be acceptable?

Comment author: Peter_de_Blanc 21 April 2009 01:15:04AM 1 point [-]

This isn't about choosing to lose. It's more about exploration vs. exploitation. If you always use the strategy you currently think is the best, then you won't get the information you need to improve.

Comment author: Mulciber 21 April 2009 02:36:46AM 1 point [-]

That seems contradictory. If you actually thought that always using one strategy would have this obvious disadvantage over another course of action, then doing so would by definition not be "the strategy you currently think is best."

Comment author: JamesAndrix 21 April 2009 03:30:10PM 4 points [-]

Experiments can always be framed as a waste of resources.

There is always something you're using up that you could put to direct productive use, even if it's just your time.

Comment author: andrewc 22 April 2009 10:59:13AM 1 point [-]

The potential information you gain from the experiment is a currency. Discount that currency (or have a low estimate of it) and yeah you can frame the experiment as a waste of resources.

Comment author: jimmy 21 April 2009 06:45:03AM 1 point [-]

You're confusing meta strategies and strategies. The best meta strategy might be implementing strategies that do not have the highest chance of succeeding, simply because you can use the information you gain to choose the actual best strategy when it matters.

Consider the case where you're trying to roll a die many times and get the most green sides coming up, and you can choose between a die that has 3 green sides and one that probably (p = 0.9) has 2 green sides but might (p = 0.1) have 4 green sides. If the game lasts 1 roll, you choose the first die. If the game lasts many, many rolls, you choose the other die until you're convinced that it only has 2 green sides -- even though this is expected to lose in the short term.
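The expected values in the die example can be written out explicitly. The numbers below follow the comment; the "long-run" figure assumes you get enough rolls to learn which composition die B actually has:

```python
# Die A: 3 of 6 sides green.  Die B: with prob 0.9 it has 2 green
# sides, with prob 0.1 it has 4 (numbers from the comment above).
p_green_A = 3 / 6
p_green_B_likely, p_green_B_lucky = 2 / 6, 4 / 6
p_lucky = 0.1

# One roll (pure exploitation) -- expected greens per roll:
ev_A = p_green_A                                                      # 0.5
ev_B = (1 - p_lucky) * p_green_B_likely + p_lucky * p_green_B_lucky   # ~0.367

# Many rolls (explore first): test die B until you know which it is,
# keep it if it's the lucky one, otherwise switch to die A.
long_run_explore = (1 - p_lucky) * p_green_A + p_lucky * p_green_B_lucky
print(ev_A < long_run_explore)   # exploring beats always rolling die A
```

So die A wins a one-shot game (0.5 vs about 0.367 per roll), while the explore-then-switch policy approaches about 0.517 greens per roll as the game lengthens, which is the exploration-vs-exploitation tradeoff the comment describes.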

Comment author: Mulciber 21 April 2009 10:06:51PM 0 points [-]

Both those courses of action with dice sound like strategies to me, not meta strategies. Could you give another example of something you'd consider a meta strategy?

I think there's a larger point lurking here, which is that a good strategy should, in general, provide for gathering information so it can adapt. Do you agree?

Comment deleted 21 April 2009 10:37:47PM *  [-]
Comment author: Mulciber 21 April 2009 11:47:27PM *  0 points [-]

That does indeed help. Thank you.

So really, a meta strategy would be something like choosing your deck for a Magic tournament based on what types of decks you expect your opponents to use, while the non-meta strategy would be your efforts to win within a game once it's started.

Comment author: MrHen 22 April 2009 12:05:26AM 0 points [-]

Ah, crap. Was that my comment? Sorry. I keep deleting comments when it looks like no one has responded.

But, yeah, Magic has a rather intense meta-game. The reason I deleted my comment was because I realized I had no idea where the meta-strategy was in the dice example so I assumed I missed something. I could be chasing down the wrong definition.

Comment author: orthonormal 22 April 2009 04:58:43AM *  6 points [-]

Ah, crap. Was that my comment? Sorry. I keep deleting comments when it looks like no one has responded.

...and that's why you really shouldn't delete a comment unless you think it's doing great harm. You may be worrying a bit too much about what others here think about every comment you make, when it's in fact somewhat random whether anyone replies to a given comment.

Comment author: Eliezer_Yudkowsky 22 April 2009 05:25:30AM 0 points [-]

Also, I believe that deleting a comment does not dissipate any negative karma that it has already earned you.

Comment author: andrewc 22 April 2009 10:55:31AM 3 points [-]

Did you write a cost function down for the various debate outcomes? The skew will inform whether overconfidence or underconfidence should be weighted differently.

Comment author: pangloss 21 April 2009 01:03:47AM 3 points [-]

Eliezer, does your respect for Aumann's theorem incline you to reconsider, given how many commenters think you should thoroughly prepare for this debate?

Comment author: Eliezer_Yudkowsky 21 April 2009 01:44:06AM 2 points [-]

Actually, the main thing that moved me was the comment about Richard Carrier also losing. I was thinking mostly that Hitchens had just had a bad day. Depending on how formidable the opponent is, it might still be a test of my ability even if I prepare.

Comment author: ciphergoth 21 April 2009 06:15:40AM 2 points [-]

Carrier lost by his own admission, on his home territory.

I've given a lot of thought to how I'd combat what he says, and what I think it comes down to is that standard, "simple" atheism that says "where is your evidence" isn't going to work; I would explicitly lead with the fact that religious language is completely incoherent and does not constitute an assertion about the world at all, and so there cannot be such a thing as evidence for it. And I would anticipate the way he's going to mock it by going there first: "I'm one of those closed-minded scientists who says he'll ignore the evidence for Jesus". At least when I play the debate out in my head, this is always where we end up, and if we start there I can deny him some cheap point scoring.

Comment author: Simey 21 April 2009 02:39:07PM 0 points [-]

"I'm one of those closed-minded scientists who says he'll ignore the evidence for Jesus"

He would probably answer that it is not scientific to ignore evidence. Miracles cannot be explained by science. But they could - theoretically - be proven with scientific methods. If someone claims to have a scientific proof of a miracle (for example a video), it would be unscientific to just ignore it, wouldn't it?

Comment author: ciphergoth 21 April 2009 03:47:54PM 0 points [-]

The idea is that you would open with this, but go on to explain why there could not be such a thing as evidence, because what is being asserted isn't really an assertion at all.

Comment author: AllanCrossman 21 April 2009 04:03:17PM 5 points [-]

I can't agree with the idea that religious assertions aren't really assertions.

A fairly big thing in Christianity is that Jesus died, but then two or three days later was alive and well. This is a claim about how the world is (or was). It's entirely conceivable that there could be evidence for such a claim. And, in fact, there is evidence - it's just not strong enough evidence for my liking.

Comment author: pangloss 21 April 2009 06:27:11AM 0 points [-]

I don't think making a move towards logical positivism or adopting a verificationist criterion of meaning would count as a victory.

Comment author: ciphergoth 21 April 2009 06:40:29AM 0 points [-]

You don't have to do either of those things, I don't think. Have a look at the argument set out in George H Smith's "Atheism: the Case against God".

Comment author: pangloss 21 April 2009 02:35:36PM 1 point [-]

I didn't think that one had to. That is what your challenge to the theist sounded like. I think that religious language is coherent but false, just like phlogiston or caloric language.

Denying that the theist is even making an assertion, or that their language is coherent is a characteristic feature of positivism/verificationism, which is why I said that.

Comment author: ciphergoth 21 April 2009 03:49:03PM 0 points [-]

No, I think it extends beyond that - see e.g. No Logical Positivist I

Comment author: RobinHanson 20 April 2009 05:42:57PM 6 points [-]

We have lots of experimental data showing overconfidence; what experimental data show a consistent underconfidence, in a way that a person could use that data to correct their error? This would be a lot more persuasive to me than the mere hypothetical possibility of underconfidence.

Comment author: timtyler 20 April 2009 06:20:35PM 3 points [-]

Underconfidence is surely very common in the general population. It's usually referred to as "shyness", "tentativeness", "depression" - or by other names besides "underconfidence". Such people are part of the audience for the self-help books that encourage readers to be more confident.

E.g. see: "The trouble with overconfidence." on PubMed.

Comment author: timtyler 20 April 2009 07:45:41PM 0 points [-]

For underconfidence and depression, see:

"Depressive cognition: a test of depressive realism versus negativity using general knowledge questions." on PubMed.

Underconfidence in visual perceptual judgments:

"The role of individual differences in the accuracy of confidence judgments." on PubMed.

More on that, see:

"Realism of confidence in sensory discrimination." on PubMed.

Comment author: Eliezer_Yudkowsky 20 April 2009 05:57:28PM 0 points [-]

I believe there were some nice experiments having to do with overcorrection, and I believe those were in "Heuristics and Biases" (the 2003 volume), but I'm on a trip right now and away from my books.

Comment author: PhilGoetz 20 April 2009 03:51:31PM 8 points [-]

And so I wrote at once to the Bloggingheads folks and asked if they could arrange a debate. This seemed like someone I wanted to test myself against. Also, it was said by them that Christopher Hitchens should have watched the theist's earlier debates and been prepared, so I decided not to do that, because I think I should be able to handle damn near anything on the fly, and I desire to learn whether this thought is correct; and I am willing to risk public humiliation to find out.

This really bothers me, because you weren't just risking your own public humiliation; you were risking our public humiliation. You were endangering an important cause for your personal benefit.

Comment author: Annoyance 20 April 2009 06:54:12PM 4 points [-]

The cause of rationalism does not rise and fall with Eliezer Yudkowsky.

If you fear the consequences of being his partisan, don't align yourself with his party. If you are willing to associate yourself and your reputation with him, accept the necessary consequences of having done so.

Comment author: Jack 20 April 2009 09:36:51PM 8 points [-]

Phil might be wrong to phrase his objection in terms of "our public humiliation". But its still the case that there are things at stake beyond Eliezer Yudkowsky's testing himself. And those are things we all care about.

Comment author: Eliezer_Yudkowsky 20 April 2009 04:26:50PM 2 points [-]

I've done a service or two to atheism, and will do more services in the future, and those may well depend on this test of calibration.

Comment author: PhilGoetz 20 April 2009 04:56:56PM 0 points [-]

I realize it is a tradeoff.

Comment author: pangloss 20 April 2009 06:40:14AM 4 points [-]

This post reminds me of Aristotle's heuristics for approaching the mean when one tends towards the extremes:

"That moral virtue is a mean, then, and in what sense it is so, and that it is a mean between two vices, the one involving excess, the other deficiency, and that it is such because its character is to aim at what is intermediate in passions and in actions, has been sufficiently stated. Hence also it is no easy task to be good. For in everything it is no easy task to find the middle, e.g. to find the middle of a circle is not for every one but for him who knows; so, too, any one can get angry- that is easy- or give or spend money; but to do this to the right person, to the right extent, at the right time, with the right motive, and in the right way, that is not for every one, nor is it easy; wherefore goodness is both rare and laudable and noble.

Hence he who aims at the intermediate must first depart from what is the more contrary to it, as Calypso advises-

Hold the ship out beyond that surf and spray.

For of the extremes one is more erroneous, one less so; therefore, since to hit the mean is hard in the extreme, we must as a second best, as people say, take the least of the evils; and this will be done best in the way we describe. But we must consider the things towards which we ourselves also are easily carried away; for some of us tend to one thing, some to another; and this will be recognizable from the pleasure and the pain we feel. We must drag ourselves away to the contrary extreme; for we shall get into the intermediate state by drawing well away from error, as people do in straightening sticks that are bent." (NE, II.9)

Comment author: thomblake 20 April 2009 04:22:09PM 0 points [-]

I agree - overcorrecting in action might well be a good technique for simply correcting virtue. A coward might do well by being brash for a bit, to settle in at courage after the fact.

Comment author: Wei_Dai 28 September 2012 07:38:03PM 2 points [-]

Can anyone give some examples of being underconfident, that happened as a result of overcorrecting for overconfidence?

Comment author: wmorgan 28 September 2012 08:56:00PM 3 points [-]

I'll give it a shot.

In poker you want to put more money in the pot with strong hands, and less money with weaker ones. However, your hand is secret information, and raising too much "polarizes your range," giving your opponents the opportunity to outplay you. Finally, hands aren't guaranteed -- good hands can lose, and bad hands can win. So you need to bet big, but not too big, with your good hands.

So my buddy and I sit down at the table, and I get dealt a few strong hands in a row, but I raise too big with them -- I'm overconfident -- so I win a couple of small pots, and lose a big one. My buddy whispers to me, "dude...you're overplaying your hands..." Ten minutes later I get dealt another good hand, and I consider his advice, but now I bet too small, underconfident, and miss out on value.

Replace the conversation with an internal monologue, and this is something you see all the time at the poker table. Once bitten, twice shy and all that.

Comment author: komponisto 28 September 2012 08:52:59PM 1 point [-]

My "revision" to my Amanda Knox post is one. I was right the first time.

Comment author: Wei_Dai 28 September 2012 09:20:54PM 1 point [-]

How did you end up concluding that your original confidence level was correct after all?

Comment author: komponisto 28 September 2012 10:31:02PM 0 points [-]

I realized that there was a difference between the information I had and the information most commenters had; also that I had underestimated my Bayesian skills relative to the LW average, so that my panicked reaction to what I perceived as harsh criticism in a few of the comments was an overreaction brought about by insecurity.

Comment author: Wei_Dai 01 October 2012 06:43:08AM 1 point [-]

I'm afraid I can't accept your example at this point, because based on my priors and the information I have at hand (the probability of guilt that you gave was 10x lower than the next lowest estimate, it doesn't look like you managed to convince anyone else to adopt your level of confidence during the discussions, absence of other evidence indicating that you have much better Bayesian skills than the LW average), I have to conclude that it's much more likely that you were originally overconfident, and are now again.

Can you either show me that I'm wrong to make this conclusion based on the information I have, or give me some additional evidence to update on?

Comment author: mfb 28 September 2012 10:47:21PM 0 points [-]

Interesting posts.

However, I disagree with your prior by a significant amount. The probability that [person in group] commits a murder within one year is small, but so is the probability that [person in group] is in contact with a victim. I would begin with the event [murder has happened], assign a high probability (like ~90%) to "the murderer knew the victim", and then distribute those 90% among people who knew her (and work with ratios afterwards). I am not familiar enough with the case to do that now, but Amanda would probably get something around 10%, before any evidence or (missing) motive is taken into account.

Comment author: shokwave 29 September 2012 12:19:04AM 0 points [-]

assign a high probability (like ~90%)

A cursory search suggests 54% is more accurate. source, seventh bullet point. Also links to a table that could give better priors.

Comment author: Nornagest 29 September 2012 01:42:44AM *  0 points [-]

I'm reading that as 54% plus some unknown but probably large proportion of the remainder: that includes a large percentage in which the victim's relationship to the perpetrator is unknown, presumably due to lack of evidence. Your link gives this as 43.9%, but that doesn't seem consistent with the table.

If you do look at the table, it says that 1,676 of 13,636 murders were known to be committed by strangers, or about 12%; the unknowns probably don't break down into exactly the same categories (some relationships would be more difficult to establish than others), but I wouldn't expect them to be wildly out of line with the rest of the numbers.

Comment author: mfb 29 September 2012 12:46:24PM *  0 points [-]

I agree with that interpretation. The 13636 murders contain:
* 1676 from strangers
* 5974 with some relation
* 5986 unknown

Based on the known cases only, I get 22% strangers. More than expected, but it might depend on the region, too (US <--> Europe). Based on that table, we can do even better: We can exclude reasons which are known to be unrelated to the specific case, and persons/relations which are known to be innocent (or non-existent). A bit tricky, as the table is "relation murderer -> victim" and not the other direction, but it should be possible.
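The 22% figure follows directly from the counts quoted above:

```python
# Counts from the quoted homicide table: victim killed by a stranger,
# by someone with a known relation, or with the relationship unknown.
strangers, related, unknown = 1676, 5974, 5986

total = strangers + related + unknown            # 13636, matching the table
stranger_share_of_known = strangers / (strangers + related)
print(total, round(stranger_share_of_known, 3))  # 13636 0.219
```

Whether the unknown cases break down the same way as the known ones is the contested assumption; the arithmetic itself is not.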

Comment author: HughRistik 22 April 2009 11:02:39PM 2 points [-]

Eliezer said:

So if you have learned a thousand ways that humans fall into error and read a hundred experimental results in which anonymous subjects are humiliated of their overconfidence - heck, even if you've just read a couple of dozen - and you don't know exactly how overconfident you are - then yes, you might genuinely be in danger of nudging yourself a step too far down.

I also observed this phenomenon of debiasing being over-emphasized in discussions of rationality, while heuristic is treated as a bad word. I tried to get at the problem of passing up opportunities you mention when I said in my post on heuristic: "It's a mistake in cartography to have areas of your map that are filled in wrong, but it's also a mistake to have areas on your map blank that you could have filled in, at least with something approximate".

I think we need more success stories of human heuristics. Currently, the glut of information on biases and faulty heuristics is making these more cognitively available, leading to underconfidence.

Of course, it's easier to measure the gravity of mistakes of overconfidence, because we know the bad outcome, and we can speculate that it would have been avoided without the overconfidence. Yet in the case of mistakes of underconfidence, we don't know what we are missing out on, what brilliant theories were prematurely discarded, and what groundbreaking inventions were never created, because the creators (or their colleagues, investors, advisors, professors, whoever) were underconfident.

Yet we can look at examples of great discoveries, ideas, solutions, and practices, and ask what our lives, or the world, would be like if they had been nipped in the bud. Furthermore, there may be cases where two people (say, scientists or entrepreneurs) were both acquainted with the same evidence or theory, yet only one was confident enough about it to capitalize on it.

Comment author: RichardKennaway 20 April 2009 12:49:38PM *  3 points [-]

There is a children's puzzle which consists of 15 numbered square blocks arranged in a frame large enough to hold 16, four by four, leaving one empty space. You can't take the blocks out of the frame. You can only slide a block into the empty space from an adjacent position. The puzzle is to bring the blocks into some particular arrangement.

The mathematics of which arrangements are accessible from which others is not important here. The key thing is that no matter how you move the blocks around, there is always an empty space. Wherever the space is, you can always move a block into it, but however fast you move the blocks, they never fill the frame.

I have not heard of this theologian before, but ciphergoth's description of him firing off piles of pet arguments faster than you can point to the holes does suggest the sliding-block metaphor, though he's playing with a much larger set of blocks. At any rate, it is utterly unlike the sedate, civilised pursuit of truth observed on Bloggingheads. It is theatre addressed to the audience, not the antagonist. As far as he is concerned, you would just be one of his supporting cast.

If a debate is arranged, I second the advice to prepare as well as possible, attending not just to the specific arguments he uses and how others fared against them, but also the theatrics. It may help that Bloggingheads does not have a live audience. I am 90% sure that if he agrees to debate you, he will ask to do so in front of one.

Comment author: Eliezer_Yudkowsky 20 April 2009 08:01:29PM 3 points [-]

And conversely, as Ari observes:

If you’ve never hit the ground while skydiving, you’re opening your parachute too early.

Comment author: outlawpoet 21 April 2009 10:12:46PM 2 points [-]

er, am I misparsing this?

It seems to me that if you haven't hit the ground while skydiving, you're some sort of magician, or you landed on an artificial structure and then never got off..

Comment author: AlexU 20 April 2009 02:28:31PM *  3 points [-]

Can someone explain why we can't name the theist in question, other than sheer silliness?

Comment author: Eliezer_Yudkowsky 20 April 2009 04:25:28PM 5 points [-]

Because I consider it unfair to him to talk about a putative debate before he's replied to a request; also somewhat uncourteous to talk about how I plan to handicap myself (especially if it's not a sign of contempt but just a desire to test myself). If people can work it out through effort, that's fine, I suppose, but directly naming him seems a bit discourteous to me. I have no idea whether he's courteous to his opponents outside debate, but I have no particular info that he isn't.

Comment author: AlexU 20 April 2009 04:32:59PM 1 point [-]

How is it unfair to him in any way? He's free to choose whether to debate or not debate you; I doubt any reasonable person would be offended by the mere contemplation of a future debate. And any sort of advantage or disadvantage that might be gained or lost by "tipping him off" could only be of the most trivial sort, the kind any truth-seeking person should best ignore. All this does is make it a bit difficult to talk about the actual substance and ideas underlying the debate, which seems to me the most important stuff anyway.

Comment author: PhilGoetz 20 April 2009 04:58:20PM 0 points [-]

I think Eliezer's reason is good. It would sound like contempt to the More Wrong.

Comment author: Nominull 20 April 2009 05:35:59PM 2 points [-]

Playing tic-tac-toe against a three-year-old for the fate of the world would actually be a really harrowing experience. The space of possible moves is small enough that he's reasonably likely to force a draw just by acting randomly.

Comment author: somervta 06 February 2013 04:08:26AM 0 points [-]

Not if you can go first.

Comment author: Luke_A_Somers 08 May 2013 03:33:49PM 3 points [-]

So, you go center.

If he goes on a flat side, you're golden (move in a nearly-opposite corner, you can compel victory).

If he goes in a corner, you go 90° away. Now, if he's really acting randomly, he has a 1/6 chance to block your next-turn win.

Then you block his win threat, making a new threat of your own, that he has a 1/4 chance to block. If he does, he'll make the last block half the time. So, a 1/96 chance to tie by moving randomly.

That would be enough to make me nervous if the fate of the world were at stake. Would you like to play Global Thermonuclear War?
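Luke's hand count can be checked by brute force. The sketch below (an added illustration, not Luke's exact move order) runs an expectimax over the full game tree: X refuses any move that risks a loss and otherwise maximizes its chance of winning, while O, the random three-year-old, picks uniformly among the open squares.

```python
# Expectimax for tic-tac-toe: X plays a no-loss, win-maximizing strategy,
# O moves uniformly at random. Returns X's (win, draw, loss) probabilities.
from functools import lru_cache

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
         (0, 3, 6), (1, 4, 7), (2, 5, 8),
         (0, 4, 8), (2, 4, 6)]

def winner(board):
    for a, b, c in LINES:
        if board[a] and board[a] == board[b] == board[c]:
            return board[a]
    return None

@lru_cache(maxsize=None)
def probs(board, to_move):
    """(p_win, p_draw, p_loss) for X from this position."""
    w = winner(board)
    if w == 'X':
        return (1.0, 0.0, 0.0)
    if w == 'O':
        return (0.0, 0.0, 1.0)
    empties = [i for i, cell in enumerate(board) if not cell]
    if not empties:
        return (0.0, 1.0, 0.0)
    children = [probs(board[:i] + (to_move,) + board[i + 1:],
                      'O' if to_move == 'X' else 'X')
                for i in empties]
    if to_move == 'X':
        # Lexicographic choice: first minimize the loss probability,
        # then maximize the win probability.
        return min(children, key=lambda p: (p[2], -p[0]))
    # O is the random three-year-old: average over all replies.
    n = len(children)
    return tuple(sum(p[j] for p in children) / n for j in range(3))

p_win, p_draw, p_loss = probs(('',) * 9, 'X')
print(round(p_win, 4), round(p_draw, 4), p_loss)
```

The loss probability comes out exactly zero, and the draw probability small but nonzero, which is exactly what makes the game harrowing when the fate of the world is at stake.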

Comment author: somervta 08 May 2013 11:38:21PM 1 point [-]

1/96 (I was thinking of a different algorithm, but the probability is the same) would be enough to make me nervous, but I wouldn't call it 'reasonably likely'

Comment author: hbarlowe 21 April 2009 12:05:36AM -1 points [-]

...the third of these is underconfidence. Michael Vassar regularly accuses me of this sin, which makes him unique among the entire population of the Earth.

Well, that sure is odd. Guess that's why Vassar was promoted. It makes sense now.

Anyway, EY's history doesn't seem to me marked by much underconfidence. For example, his name has recently been used in vain at this silly blog, where they're dredging up all sorts of amusing material that seems to support the opposite conclusion.

Since I know EY has guru status around here, please don't jump down my throat. For the record, I agree with everything he says. I must, for the force of his rationality encircles me and compels me.

Anyway, for those who don't want to follow the link, here's the best part -- a bit of pasted materials in a comment by someone named jimf:


When Ayn [Rand] announced proudly, as she often did, 'I can account for every emotion I have' -- she meant, astonishingly, that the total contents of her subconscious mind were instantly available to her conscious mind, that all of her emotions had resulted from deliberate acts of rational thought, and that she could name the thinking that had led her to each feeling. And she maintained that every human being is able, if he chooses to work at the job of identifying the source of his emotions, ultimately to arrive at the same clarity and control.


Barbara Branden, The Passion of Ayn Rand pp. 193 - 195

From a transhumanist acquaintance I once corresponded with:

Jim, dammit, I really wish you'd start with the assumption that I have a superhuman self-awareness and understanding of ethics, because, dammit, I do.

Comment author: orthonormal 21 April 2009 12:47:58AM 2 points [-]

With detractors like this, who needs supporters? I almost wonder whether razib wrote that blog post in one of his faux-postmodernist moods.

I advise you all not to read it; badly written and badly supported criticism of EY is too powerful of a biasing agent towards him.

Comment author: Jonathan_Graehl 21 April 2009 06:39:00AM *  0 points [-]

This is a brutal oversimplification, but it seems to me, roughly speaking, that in mis-identifying fundamentalism with the humanities, they tend to advocate a reductionism that re-writes science itself in the image of a priestly authoritarianism with too much in common with the very fundamentalisms they claim to disdain (and rightly so).

The author understandably distances himself from his own output, reminiscent of the passages ridiculed in "Politics and the English Language".

Comment author: FrF 20 April 2009 10:57:45AM *  1 point [-]

Eliezer should write a self-help book! Blog posts like the above are very inspiring to this perennial intellectual slacker and general underachiever (meaning: me).

I certainly can relate to this part:

"It doesn't seem worthwhile any more, to go on trying to fix one thing when there are a dozen other things that will still be wrong...

There's not enough hope of triumph to inspire you to try hard..."

Comment author: CronoDAS 20 April 2009 09:33:45PM *  1 point [-]

Slightly off-topic:

I don't know if it would be possible to arrange either of them, but there are two debates I'd love to see Eliezer in:

A debate with Amanda Marcotte on evolutionary psychology

and

A debate with Alonzo Fyfe on meta-ethics.

Comment author: HughRistik 22 April 2009 11:46:33PM *  2 points [-]

A debate with Amanda Marcotte on evolutionary psychology

Before anyone even thinks about this, they need to read Gender, Nature, and Nurture by Richard Lippa. He creates a hypothetical debate between Nature and Nurture which is very well done. Nurture has a bunch of arguments that sound "reasonable" and will be persuasive to audiences who are either close-minded or unfamiliar with the research literature, yet are actually sophistry. I would recommend having at least some sort of an answer to all of those points.

Defending evolutionary psychology in a debate is going to be very hard, because the playing field is so stacked. It's really easy to get nailed by skeptical sophistry or defeated by a King on the Mountain. In this case, the King would be arguing something like "male-female differences are socially constructed."

Appreciating the argument of evolutionary psychology, like evolution itself, requires thinking holistically and tying a lot of arguments and evidence together. This is difficult in a verbal debate, where a skilled sophist will take your statements and evidence in isolation and ridicule them without giving you a chance to link them together into a bigger picture:

The amount of information that the King considers at one time is very small: one statement. He makes one decision at a time. He then moves on to the next attempted refutation, putting all previous decisions behind him. The broad panorama--of mathematical, spatial, and temporal relationships between many facts--that makes up the pro-evolution argument, which need to be viewed all at once to be persuasive, cannot get in, unless someone finds a way to package it as a one-step-at-a-time argument (and the King has patience to hear it). Where his opponent was attempting to communicate just one idea, the King heard many separate ideas to be judged one by one.

Comment author: Annoyance 20 April 2009 10:02:00PM -1 points [-]

Overconfidence is usually costlier than underconfidence. The cost to become completely accurate is often greater than the benefit of being slightly-inaccurate-but-close-enough.

When these two principles are taken into account, underconfidence becomes an excellent strategy. It also leaves potential in reserve in case of emergencies. As being accurately-confident tends to let others know what you can do, it's often desirable to create a false appearance.

Comment author: Douglas_Knight 21 April 2009 03:44:07AM 4 points [-]

The cost of underconfidence is an opportunity cost. This is easy to miss, so it will be underweighted (salience bias). This is not a rebuttal, but it is a reason to expect people will falsely conclude that overconfidence is costlier.

Comment author: Annoyance 21 April 2009 01:43:06PM 0 points [-]

I approve of your response, Douglas_Knight, but think that it is both incomplete and somewhat inaccurate.

The cost of underconfidence isn't necessarily or always an opportunity cost. It can be so, yes. But it can also be not so. You are making a subtle and mostly implicit claim of universality regarding an assertion that is not universally the case.

A strategy doesn't need to work in every possible contingency to be useful or valid.

Comment author: mattnewport 20 April 2009 10:09:04PM *  1 point [-]

Overconfidence is usually costlier than underconfidence.

I suspect you are overconfident in that belief. Simply stating something is not a persuasive argument.

Comment author: Annoyance 20 April 2009 10:25:11PM -1 points [-]

"Simply stating something is not a persuasive argument."

Is simply stating that supposed to be persuasive?

Sooner or later we have to accept or reject arguments on their merits, and that requires evaluating their supports. Not demanding supports for them.

Comment author: mattnewport 20 April 2009 10:39:47PM 2 points [-]

Overconfidence and underconfidence both imply a non-optimal amount of confidence. It's a little oxymoronic to claim that underconfidence is an excellent strategy - if it's an excellent strategy then it's presumably not underconfidence. I assume what you are actually claiming is that in general most people would get better results by being less confident than they are? Or are you claiming that relative to accurate judgements of probability of success it is better to consistently under rather than over estimate?

You claim that overconfidence is usually costlier than underconfidence. There are situations where overconfidence has potentially very high cost (overconfidently thinking you can safely overtake on a blind bend perhaps) but in many situations the costs of failure are not as severe as people tend to imagine. Overconfidence (in the sense of estimating greater probability of success than is accurate) can usefully compensate for over estimating the cost of failure in my experience.

You seem to have a pattern of responding to posts with unsupported statements that appear designed more to antagonize than to add useful information to the conversation.

Comment author: MrHen 20 April 2009 11:06:04PM 0 points [-]

I am replying here instead of higher because I agree with mattnewport, but this is addressed to Annoyance. It is hard for me to understand what you mean by your post because the links are invisible and I did not instinctively fill them in correctly.

Overconfidence is usually costlier than underconfidence.

As best as I can tell, this is situational. I think mattnewport's response is accurate. More on this below.

The cost to become completely accurate is often greater than the benefit of being slightly-inaccurate-but-close-enough.

It seems that the two paths from this statement are to stay inaccurate or start getting more efficient at optimizing your accuracy. It sounds too similar to saying, "It is too hard. I give up," for me to automatically choose inaccuracy. I want to know why it is so hard to become more accurate.

It also seems situational in the sense that it is not always, just often. This is relevant below.

When these two principles are taken into account, underconfidence becomes an excellent strategy.

In addition to mattnewport's comment about underconfidence implying non-optimal confidence, I think that building this statement on two situational principles is dangerous. Filling out the (situational) blanks leads to this statement:

If underconfidence is less costly than overconfidence and the cost of becoming more accurate is more than the benefit of being more accurate, then stay underconfident.

This seems to work just as well as saying this:

If overconfidence is less costly than underconfidence and the cost of becoming more accurate is more than the benefit of being more accurate, then stay overconfident.

Which can really be generalized to this:

If it costs more to change your confidence than the resulting benefit, do not change.

Which just leads us back to mattnewport's comment about optimal confidence. It also seems like it was not the point you were trying to make, so I assume I made a mistake somewhere. As best as I can tell, it was underemphasizing the two situational claims. As a result, I fully understand the request for more support in that area.

It also leaves potential in reserve in case of emergencies. As being accurately-confident tends to let others know what you can do, it's often desirable to create a false appearance.

Acting overconfident is another form of bluffing. Also, acting one way or the other is a little different than understanding your own limits. How does it help if you bluff yourself?

Comment author: Annoyance 21 April 2009 01:54:10PM *  -2 points [-]

"Overconfidence and underconfidence both imply a non-optimal amount of confidence."

Not in the sense of logical implication. The terms refer to levels of confidence greater or lesser than they should be, with the criteria utilized determining what 'should' means in context. The utility of the level of confidence isn't necessarily linked to its accuracy.

Although accuracy is often highly useful, there are times when it's better to be inaccurate, or to be inaccurate in a particular way, or a particular direction.

"You seem to have a pattern of responding to posts with unsupported statements"

I can support my statements, and support my supports, and support my support supports, but I can't provide an infinite chain of supports. No one can. The most basic components of any discussion stand by themselves, and are validated or not by comparison with reality. Deal with it.

"that appear designed more to antagonize than to add useful information to the conversation"

They're crafted to encourage people to think and to facilitate that process to the degree to which that is possible. I can certainly see how people uninterested in thinking would find that unhelpful, even antagonizing. So?

Comment author: quasimodo 16 June 2012 06:26:42AM 0 points [-]

Why is confidence or lack thereof an issue aside from personal introspection?

Comment author: beoShaffer 17 June 2012 07:02:41AM 1 point [-]

If you are underconfident you may pass up risky but worthwhile opportunities, or spend resources on unnecessary safety measures. As for overconfidence, see hubris. Also, welcome to Less Wrong.

Comment author: HughRistik 22 April 2009 11:01:26PM *  1 point [-]

This post reminds me of the phrase "cognitive hyper-humility," used by Ben Kovitz's Sophistry Wiki:

Demand for justification before making a move. Of course, this is not always sophistry. In some special areas of life, such as courtroom trials, we demand that a "burden" of specific kinds of evidence be met as a precondition for taking some action. Sophistry tends to extend this need for justification far beyond the areas where it's feasible and useful. Skeptical sophistry tends to push a sort of cognitive hyper-humility, or freezing out of fear of ever being "wrong"--or even being right but not fully justified. If you were to reason as the skeptic suggests that you should reason, you'd never be able to do anything in real life, because you'd never have sufficiently articulated and proven a priori principles to get started, nor evidence to justify your actions according to those principles, nor time to think this stuff through to the demanded degree.

Comment author: Jayson_Virissimo 22 April 2009 01:07:15AM *  1 point [-]

Does the fact that I find this guy's formulation of the cosmological argument somewhat persuasive mean that I can't hang out with the cool kids anymore? I'm not saying it is an airtight argument, just that it isn't obviously meaningless or ridiculous metaphysics.

Comment author: ektimo 20 April 2009 05:25:12PM *  1 point [-]

If there was a message I could send back to my younger self this would be it. Plus that if it's hard, don't try to make it easier, just keep in mind that it's important. (By younger self, I mean 7-34 years old.)

Comment author: PhilGoetz 20 April 2009 04:02:25PM *  1 point [-]

IHAPMOE, but the post seems to assume that a person's "rationality" is a float rather than a vector. If you're going to try to calibrate your "rationality", you'd better try to figure out what the different categories of rationality problems there are, and how well rationality on one category of problems correlates with rationality on other categories. Otherwise you'll end up doing something like having greater confidence in your ethical judgements because you do well at sudoku.

Comment author: [deleted] 20 April 2009 12:36:52PM 1 point [-]

This seems like a reflection of a general problem people have, the problem of not getting things done - more specifically, the problem of not getting things done by convincing yourself not to do them.

It's so much easier to NOT do things than do them, so we're constantly on the lookout for ways not to do them. Of course, we feel bad if we simply don't do them, so we first have to come up with elaborate reasons why it's ok - "I'll have plenty of time to do it later", "There's too much uncertainty", "I already got a lot of work done today", etc. The underconfidence you're describing seems like another attempt at this rather than a peculiar habit of rationalists.

I try to fight this, semi-successfully, by remembering that it's only the RESULT that matters. If I want something, it doesn't matter what clever words or arguments I make to myself about doing it; in the end I either get it or I don't. And there's certainly nothing rational about convincing yourself to not get something you want; rationalists WIN, after all.

Comment author: dclayh 21 April 2009 11:18:57PM *  1 point [-]

It's so much easier to NOT do things than do them, so we're constantly on the lookout for ways not to do them.

In CS, laziness is considered a virtue, principally (I believe) because being too lazy to do something the hard (but obvious) way tends to lead to coming up with an easy (clever) way that's probably faster and more elegant.

And there's certainly nothing rational about convincing yourself to not get something you want

But what if you convince yourself not to want it?

Comment author: AshwinV 09 October 2014 11:38:27AM 0 points [-]

Last paragraph, opening parenthesis missing. (I'm on a typo roll, it seems.)