DanArmak comments on Open Thread: October 2009 - Less Wrong

Post author: gwern, 01 October 2009 12:49PM


Comment author: DanArmak 02 October 2009 03:06:17PM 0 points

Just saying "black swan" isn't enough to justify a higher probability. If you think I can't assign any meaningful probability at all to this scenario, why not?

Comment author: billswift 02 October 2009 03:29:41PM 1 point

I don't believe anyone can meaningfully assign very small or very large probabilities in most situations. It is one of my long-running disagreements with people here and on OB.

Comment author: DanArmak 02 October 2009 03:42:09PM 0 points

There are indeed many known human biases of this kind, plus a general inability to discriminate small differences in probability.

But we can't treat every low-probability scenario as having, say, p=0.1 or some other constant probability! What do you suggest instead?

Comment author: billswift 02 October 2009 05:25:55PM 0 points

I don't know of a unified way of handling extremely small risks, but two things can help. First, as Marc Stiegler suggests in "David's Sling", simply recognize explicitly that such risks are possible; that way, if one does occur, you can get on with dealing with the problem without also having to fight disbelief that it could have happened at all. Second, different people have different perspectives and interests and will treat different low-probability events differently; this dispersion of views and preparation helps ensure that someone is at least somewhat prepared. As I said, neither of these is really enough, but I simply can't see any better options.

Comment deleted 02 October 2009 03:48:45PM
Comment author: Vladimir_Nesov 02 October 2009 03:56:11PM 1 point

You have to assign probabilities anyway. See the amended article:

Considering some event a black swan doesn't give one leave to avoid assigning probabilities, since making decisions that depend on the plausibility of such an event is still equivalent to assigning the probabilities that make the expected-utility calculation yield those decisions.

Comment deleted 02 October 2009 04:20:30PM
Comment author: DanArmak 02 October 2009 04:44:41PM 1 point

How much is our civilization worth? Say, 10^20 dollars.

That's meaningless. You can't assign a value in dollars to the continued existence of our civilization. Dollars are only useful for pricing things inside that civilization. (Some people argue for using utilons to price the civilization's existence.)

If I had the money, I would be willing to part with 10^6 dollars to develop, manufacture, and distribute the book. Therefore, the probability of the book serving its primary purpose is 10^(-14).

The amount you're willing to pay is a fact about you, not about the book's usefulness. You're saying you estimate its probability of usefulness at 10^-14. But why?
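For concreteness, the implied-probability arithmetic in the quoted comment can be sketched as follows. This is just a minimal illustration of the break-even expected-utility reasoning being discussed; the dollar figures are the ones quoted above, not independent estimates:

```python
# Implied probability from willingness to pay: if paying `cost` for a
# gamble whose payoff is `value` breaks even in expectation, then
# cost = p * value, so p = cost / value.

def implied_probability(cost: float, value: float) -> float:
    """Probability at which paying `cost` for an outcome worth `value`
    is break-even in expected value."""
    return cost / value

cost = 1e6    # willingness to pay for the book (figure quoted above)
value = 1e20  # posited dollar value of civilization (figure quoted above)

print(implied_probability(cost, value))  # 1e-14
```

This only shows that the stated willingness to pay is *consistent* with a probability of 10^-14, which is DanArmak's point: the calculation reads the probability off the decision rather than justifying it.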

Comment author: wedrifid 02 October 2009 04:56:55PM 2 points

Clearly the market for civilization creation books is efficient.

Comment author: DanArmak 02 October 2009 05:03:20PM 1 point

Nice point. Maybe we should instead talk about scenarios where humanity (including us) no longer suffers aging but a collapse still occurs.

Incidentally, I wonder what the market price for writing a civilization-destroying book might be?

Comment author: wedrifid 02 October 2009 05:06:18PM 3 points

I believe the going rate is 45 virgins in the afterlife.