bill comments on Rationality, Cryonics and Pascal's Wager - Less Wrong

12 [deleted] 08 April 2009 08:28PM




Comment author: AnnaSalamon 09 April 2009 08:15:37PM 0 points [-]

He seems to be using 1/1000 as the cutoff for where human estimates of probability stop being accurate enough to base decisions on.

I doubt this is what Roko means. Probabilities are "in the mind"; they're our best subjective estimates of what will happen, given our incomplete knowledge and calculating abilities. In some sense it doesn't make sense to talk about our best-guess probabilities being (externally) "accurate" or "inaccurate". We can just make the best estimates we can make.

What can it mean for probabilities to "not be accurate enough to base decisions on"? We have to decide, one way or another, with the best probabilities we can build or with some other decision procedure. Is zero an accurate enough probability (of cryonics success, or of a given Pascal's wager-like situation) to base decisions on, if an estimated 1 in ten thousand or whatever is not?

Comment author: bill 09 April 2009 08:32:55PM *  1 point [-]

In health and safety decisions, people often need to reason about risks on the order of one in a million.

In nuclear safety, I hear, they use a measure called "nanomelts," a one-in-a-billion risk of a meltdown. They can then rank risks by cost-to-fix per nanomelt, for example.
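The ranking idea can be sketched in a few lines. Everything here is made up for illustration, the fix names, costs, and nanomelt figures are not from any real safety analysis:

```python
# Hypothetical illustration: ranking safety fixes by cost per nanomelt averted
# (one nanomelt = a one-in-a-billion chance of a meltdown).
# All names and figures below are invented for the example.

fixes = [
    # (name, cost in dollars, nanomelts of risk removed)
    ("redundant coolant pump", 2_000_000, 40),
    ("backup diesel generator", 500_000, 25),
    ("operator retraining", 100_000, 4),
]

# Cheapest risk reduction first: dollars per nanomelt removed.
ranked = sorted(fixes, key=lambda f: f[1] / f[2])
for name, cost, nm in ranked:
    print(f"{name}: ${cost / nm:,.0f} per nanomelt")
```

With these numbers, the backup generator comes out cheapest per unit of risk removed even though the coolant pump removes the most risk in absolute terms, which is the whole point of normalizing by nanomelts.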

In both of these, though, the estimates might be grounded in data and then scaled to different timescales (e.g., roughly 250 deaths per day in the US from car accidents, over a population of about 250 million, works out to about a 1-in-a-million per-day risk of death from driving; statistical techniques can then adjust this number for age, drunkenness, etc.).
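As a sketch of this kind of scaling, taking roughly 250 driving deaths per day across a population of about 250 million (round numbers assumed for illustration, not official statistics):

```python
# Back-of-the-envelope scaling of aggregate death counts to a
# per-person daily risk. Both inputs are assumed round numbers.

deaths_per_day = 250
population = 250_000_000

daily_risk = deaths_per_day / population  # risk per person per day
print(f"Daily risk of death from driving: 1 in {1 / daily_risk:,.0f}")
```

This recovers the 1-in-a-million per-day figure; a per-year count divided by the same population would come out roughly 365 times smaller per day.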