
Manfred comments on Open thread, 11-17 March 2014 - Less Wrong Discussion

3 Post author: David_Gerard 11 March 2014 10:45PM


Comments (226)


Comment author: Lumifer 12 March 2014 01:22:02AM 4 points

I think Knightian uncertainty is a very useful concept. Sometimes "I don't know" is the right answer. I can't estimate the probabilities, I have no evidence, no decent priors -- I just do not know. It's much better to accept that than to start inventing fictional probabilities.

Black Swan isn't a theory; it's basically a correct observation that statistical models of the world are limited in many important ways and depend on many implicit and explicit assumptions (a typical one being the stability of the underlying process). When an assumption turns out to be wrong, the model breaks, sometimes spectacularly.
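To make the observation concrete, here is a minimal sketch (with invented numbers) of how a model that assumes a stable underlying process fails when that assumption breaks:

```python
# Sketch: a model that assumes the process is stable ("stationary") can fail
# badly when that assumption breaks. All numbers here are invented for
# illustration.
history = [10.0, 10.2, 9.8, 10.1, 9.9]   # calm period the model was fit on
mean = sum(history) / len(history)        # model: "tomorrow looks like the past"
prediction = mean                         # ~10.0

actual = 25.0                             # regime change the model never saw
error = abs(actual - prediction)          # a spectacular miss, not mere noise
```

The point is not that averaging is a bad estimator, but that its error bars are only meaningful while the stability assumption holds.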

Nassim Taleb tried to make a philosophy out of that observation. I am not particularly impressed by it.

Comment author: Manfred 12 March 2014 04:27:40AM 8 points

The trouble, of course, is that "I don't know" is not an action. If "I don't know" means "don't deviate from the status quo," that can be a bad plan when the status quo is bad.

Comment author: Lumifer 12 March 2014 04:29:08AM 1 point

The trouble, of course, is that "I don't know" is not an action.

Yes, and why is this "trouble"?

Comment author: AlexSchell 12 March 2014 04:55:42AM 6 points

The only point of probabilities is to have them guide actions. How does the concept of Knightian uncertainty help in guiding actions?

Comment author: hamnox 13 March 2014 03:14:40AM 1 point

More concretely than Lumifer's answer: it would encourage you to diversify your plans and try not to rely on leveraging any one model or enterprise. It also encourages you to play the odds instead of playing it safe, because "safe" is rarely as safe as you think it is. Try new things regularly, since the cost of doing them grows only linearly with the number of tries, while the pay-off of any one try is occasionally outsized.

That's what I got out of it, anyways.
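The "linear cost, occasional outsized pay-off" reasoning above can be sketched with toy numbers (all assumed for illustration, not taken from any real data):

```python
# Toy illustration: trying many cheap things costs a fixed amount per try
# (total cost linear in the number of tries), while a single rare win can
# dominate everything. All values are invented.
cost_per_try = 1
payoffs = [0, 0, 0, 0, 0, 0, 0, 0, 0, 50]  # one rare large win in ten tries

n = len(payoffs)
total_cost = cost_per_try * n   # linear in the number of tries: 10
total_payoff = sum(payoffs)     # dominated by the single rare win: 50
net = total_payoff - total_cost # positive despite nine "failures": 40
```

The design choice being illustrated: under deep uncertainty about *which* try will pay off, spreading many small bets beats committing everything to one model of which bet is best.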

Comment author: AlexSchell 15 March 2014 01:16:22AM 0 points

I'm not actually sure the concept can do all that work, mostly because we don't have plausible theories for making decisions from imprecise probabilities (with precise probabilities we have expected utility maximization). See e.g. this very readable paper.
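A small sketch of the underlying problem (toy payoffs and probabilities, all assumed): with a precise probability, expected utility maximization picks an action, but with only an interval of probabilities the usual rule is underdetermined, and proposed rules like Gamma-maximin get an answer only by building in extra assumptions:

```python
# Toy decision: carry an umbrella or not, with invented payoffs.
def expected_utility(p_rain, action):
    payoffs = {"umbrella": (1, 1), "no_umbrella": (-5, 3)}  # (if rain, if dry)
    u_rain, u_dry = payoffs[action]
    return p_rain * u_rain + (1 - p_rain) * u_dry

actions = ["umbrella", "no_umbrella"]

# With a precise probability, the best action is well defined:
p = 0.4
best = max(actions, key=lambda a: expected_utility(p, a))

# With only an interval [0.1, 0.7] of admissible probabilities, each action
# is best for SOME admissible p, so plain expected-utility maximization no
# longer picks a unique action. Gamma-maximin (maximize the worst-case
# expected utility over the interval) is one proposed repair; since expected
# utility is linear in p, the worst case sits at an endpoint.
interval = (0.1, 0.7)
gamma_maximin = max(
    actions,
    key=lambda a: min(expected_utility(q, a) for q in interval),
)
```

With these numbers, both rules happen to choose the umbrella, but the interval case required choosing a decision rule first, which is exactly the gap the comment points at.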

Comment author: Lumifer 12 March 2014 02:56:49PM 0 points

The only point of probabilities is to have them guide actions.

I don't agree with that (a quick example is that speculating about the Big Bang is entirely pointless under this approach), but that's a separate discussion.

How does the concept of Knightian uncertainty help in guiding actions?

It allows you to not invent fake probabilities and suffer from believing you have a handle on something when in reality you don't.

Comment author: ShardPhoenix 14 March 2014 02:58:04AM 2 points

a quick example is that speculating about the Big Bang is entirely pointless under this approach

Such speculation may help guide actions regarding future investments in telescopes, decisions on whether to try to look for aliens, etc.

Comment author: AlexSchell 15 March 2014 01:09:25AM 0 points

OK, I'll give you that we might non-instrumentally value the accuracy of our beliefs (even so, I don't know how to unpack 'accuracy' in a way that can handle both probabilities and Knightian uncertainty, but I agree this is another discussion). I still suspect that the concept of Knightian uncertainty doesn't help with instrumental rationality, bracketing the supposed immorality of assigning probabilities from sparse information. (Recall that you claimed Knightian uncertainty was 'useful'.)