
owencb comments on Be secretly wrong

32 points. Post author: Benquo, 10 December 2016 07:06AM




Comment author: owencb 10 December 2016 02:10:39PM 3 points

I'm not sure exactly what you meant, so I'm not ultimately sure whether I disagree, but I at least felt uncomfortable with this claim.

I think it's because:

  • Your framing pushes towards holding beliefs rather than credences in the sense used here.
  • I think it's generally inappropriate to hold beliefs about the kinds of things that are important and that you're likely to turn out to be wrong about. (Of course, for boundedly rational agents it's acceptable to hold beliefs about some things as a time/attention-saving matter.)
  • It's normally right to update credences gradually as more evidence comes in. There isn't so much an "I was wrong" moment.

On the other hand I do support generating explicit hypotheses, and articulating concrete models.

Comment author: Benquo 10 December 2016 07:23:51PM 5 points

I think this clarifies an important area of disagreement:

I claim that there are lots of areas where people have implicit strong beliefs, and it's important to make those explicit to double-check. Credences are important for any remaining ambiguity, but for cognitive efficiency, you should partition off as much as you can as binary beliefs first, so you can do inference on them - and change your mind when your assumptions turn out to be obviously wrong. This might not be particularly salient to you because you're already very good at this in many domains.

This is what I was trying to do with my series of blog posts on GiveWell, for instance - partition off some parts of my beliefs as a disjunction I could be confident enough in to treat as a set of beliefs I could reason logically about. (For instance, Good Ventures either has increasing returns to scale, or diminishing, or constant, at its given endowment.) What remains is substantial uncertainty about which branch of the disjunction we're in, and that should be parsed as a credence - but scenario analysis requires crisp scenarios, or at least crisp axes to simulate variation along.

Another way of saying this is that from many epistemic starting points it's not even worth figuring out where you are in credence-space on the uncertain parts, because examining your comparatively certain premises will lead to corrections that fundamentally alter your credence-space.
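[Editor's note: a minimal Python sketch of the structure described above - hold an exhaustive disjunction as a crisp belief, and put credences only on which branch you're in. The branch labels echo the Good Ventures example; the numbers and per-branch implications are made up for illustration and are not from the original thread.]

    # Credences over the branches of a crisp disjunction (numbers are assumed).
    credence = {
        "increasing returns to scale":  0.2,
        "constant returns to scale":    0.3,
        "diminishing returns to scale": 0.5,
    }

    # The disjunction itself is held as a binary belief: exactly one branch is
    # true, so the credences over branches should exhaust the space.
    assert abs(sum(credence.values()) - 1.0) < 1e-9

    # Within each crisp branch you can reason logically, e.g. attach an
    # implication to each scenario (these are illustrative placeholders).
    implication = {
        "increasing returns to scale":  "concentrating the endowment looks better",
        "constant returns to scale":    "scale is roughly neutral",
        "diminishing returns to scale": "splitting or partial funding looks better",
    }

    for branch, p in credence.items():
        print(f"P={p:.1f}  if {branch}: {implication[branch]}")

The point of the sketch is that the per-branch reasoning stays logical because the disjunction is crisp; only the weights on the branches move as evidence comes in.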

Comment author: owencb 10 December 2016 11:32:40PM 1 point

This was helpful to me, thanks.

I think I'd still endorse a bit more of a push towards thinking in credences (where you're at a threshold of that being a reasonable thing to do), but I'll consider further.

Comment author: ChrisHibbert 10 December 2016 05:42:07PM 0 points

I'm all about epistemology. (My blog is at pancrit.org.) But in order to engage in or start a conversation, it's important to take one of the things you place credence in and advocate for it. In many circumstances, if you're wishy-washy, people won't actually engage with your hypothesis, so you won't learn anything about it. Take a stand, even if you're on slippery ground.

Comment author: Benquo 10 December 2016 07:26:52PM 1 point

Per my reply to Owen, I think it's fine to say "X% A, 100-X% not-A" as a way to start a discussion, and even to be fuzzy about the percentage, but it's then important to be pretty clear about the structure of A and not-A, to have some clear "A OR not-A" belief, and to have beliefs about what it looks like if A is true vs. false.