roko, Given that at least some physicists have come up with vaguely plausible mechanisms for stable micro black hole creation, you should think about outrageous or outspoken claims made in the past by a small minority of scientists. How often has the majority view been overturned? I suspect that something like 1/1000 is a good rough guess for the probability of the LHC destroying us. This reasoning gives the probability 1/1000 for any conceivable minority hypothesis, which is inconsistent. In general, I think this debate only illustrates that people are not at all good at guessing extremely low or extremely high probabilities, and usually end up in some sort of inconsistency.
This reasoning gives the probability 1/1000 for any conceivable minority hypothesis, which is inconsistent.
Inconsistent with what? "Inconsistent" is a two-place predicate.
It gives us different probabilities for different hypotheses, depending on which minority holds them. The idea that global warming is not caused by human activity is currently believed by about 1-2% of climatologists.
If you have a hard time finding a theory that you can't, by this criterion, say is true with more than 999/1000 probability, I'd say that's a feature, not a bug.
Followup to: When (Not) To Use Probabilities, How Many LHC Failures Is Too Many?
While trying to answer my own question on "How Many LHC Failures Is Too Many?" I realized that I'm horrendously inconsistent with respect to my stated beliefs about disaster risks from the Large Hadron Collider.
First, I thought that stating a "one-in-a-million" probability for the Large Hadron Collider destroying the world was too high, in the sense that I would much rather run the Large Hadron Collider than press a button with a known 1/1,000,000 probability of destroying the world.
But if you asked me whether I could make one million statements of authority equal to "The Large Hadron Collider will not destroy the world", and be wrong, on average, around once, then I would have to say no.
Unknown pointed out that this turns me into a money pump. Given a portfolio of a million existential risks to which I had assigned a "less than one in a million probability", I would rather press the button on the fixed-probability device than run a random risk from this portfolio; but would rather take any particular risk in this portfolio than press the button.
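The money pump can be made concrete with a small sketch (not from the original post; the portfolio size and probability values are hypothetical). A uniform random draw from the portfolio is a mixture of its elements, so its risk is the average of the individual risks, which is strictly below the button's fixed 1/1,000,000:

```python
import random

N = 1_000_000
# Each risk in the portfolio was judged "less than one in a million".
# (Illustrative assumption: the exact values are unknown, so we draw
# them uniformly below the 1e-6 bound.)
portfolio = [random.uniform(0.0, 1e-6) for _ in range(N)]

button_p = 1e-6                       # the fixed-probability device
random_draw_p = sum(portfolio) / N    # risk of a uniform random draw

# The random draw is a mixture of risks that are each below 1e-6,
# so it is strictly safer than the button:
assert random_draw_p < button_p

# Hence the stated preferences are circular: preferring the button
# over a random draw, while preferring every individual risk in the
# portfolio over the button, cannot both be right.
print(random_draw_p < button_p)  # True
```

This is only a sketch of the dominance argument; the inconsistency does not depend on the particular distribution assumed for the portfolio.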
Then, I considered the question of how many mysterious failures at the LHC it would take to make me question whether it might destroy the world/universe somehow, and what this revealed about my prior probability.
If each failure had a known 50% probability of occurring from natural causes, like a quantum coin or some such... then I suspect that if I actually saw that coin come up heads 20 times in a row, I would feel a strong impulse to bet on its coming up heads the next time around. (And that's taking into account my uncertainty about whether the anthropic principle really works that way.)
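The anthropic update behind that impulse can be sketched numerically. Assume a hypothetical prior of one in a million that the LHC is world-destroying, and grant (as the parenthetical hedges, this is exactly the contested anthropic assumption) that if it is dangerous, every run we survive to observe is a failure:

```python
# Toy Bayesian sketch; the 1e-6 prior is a placeholder, not a figure
# from the post, and the anthropic likelihood of 1.0 is an assumption.
prior_dangerous = 1e-6   # hypothetical prior that the LHC destroys the world
n_failures = 20

# If dangerous, anthropic selection makes every observed run a failure.
lik_dangerous = 1.0
# If safe, 20 fair-coin failures in a row have likelihood 2**-20.
lik_safe = 0.5 ** n_failures

posterior = (prior_dangerous * lik_dangerous) / (
    prior_dangerous * lik_dangerous + (1 - prior_dangerous) * lik_safe
)
print(posterior)  # ≈ 0.51
```

Under these toy assumptions, 20 "coincidental" failures drag a one-in-a-million prior up to roughly even odds, which is why the coin coming up heads 20 times would start to feel like evidence for heads next time.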
Even having noticed this triple inconsistency, I'm not sure in which direction to resolve it!
(But I still maintain my resolve that the LHC is not worth expending political capital, financial capital, or our time to shut down, compared with using the same capital to worry about superhuman intelligence or nanotechnology.)