Yvain comments on A Thought on Pascal's Mugging - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (159)
Lifeboat Foundation fits my criteria of "reasonable", as do some of the commenters here. Even if there's only a one in a million risk of destroying the world, that's still equivalent to killing 6,000 people with probability one; potentially destroying the Universe should require even more caution.
There's not even a one in a million; it's closer to "But there's still a chance, right?"
And you're still dealing in probabilities too small to sensibly calculate with in this manner and say anything meaningful. "Switching on the LHC is equivalent to killing 6,000 people for certain" isn't actually a sensible statement when rendered in English, and I don't see another way to render in English your calculated result that switching it on is "equivalent to killing 6,000 people with probability one". But please do enlighten me.
(I realise you're multiplying 6E9 by 1E-6 and asserting that six billion conceptual millionth-of-a-person slivers equals six thousand actual existing people. "Shut up and multiply" doesn't stop me balking at this, and that the result says "switching on the LHC is equivalent to killing 6,000 people for certain" seems to constitute a reductio ad absurdum for however one gets there.)
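The multiplication being balked at can be made explicit; a minimal sketch using the thread's own figures (a world population of six billion and the "one in a million" risk):

```python
# Expected deaths from a one-in-a-million chance of killing everyone.
world_population = 6e9   # the 6E9 figure used in the thread
p_catastrophe = 1e-6     # the "one in a million" risk

expected_deaths = world_population * p_catastrophe
print(expected_deaths)  # 6000.0
```

Whether six billion millionth-of-a-person slivers really are interchangeable with 6,000 whole people is exactly what is in dispute; the arithmetic itself is not.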
Rees estimated the probability of the LHC destroying the world at 1 in 50 million, and it would be surprising if he were one of the few people in the world without overconfidence bias, or one of the few people in the world who doesn't underestimate global existential risks.
I assume from the first sentence that you believe an appropriate probability to have for the LHC destroying the world is less than one in a billion. Trusting anyone, even the world scientific consensus, with one in a billion probability seems excessive to me - the world scientific consensus has been wrong on more than one in every billion issues it thinks it's sure about. If you're working not off the world scientific consensus but off your own intuition, that seems even stranger - if, for example, the LHC will destroy the world if and only if strangelets are stable at 10 TeV, then you just discovered important properties about the stability of strangelets to p < .000000001 certainty, which seems like the sort of thing you shouldn't be able to do without any experiments or mathematics. If you're working off of a general tendency for the world not to be destroyed, well, there were five mass extinction events in the past billion years, so ignoring for the moment the tendency of mass extinctions to take multiple years, that means the probability of a mass extinction beginning in any particular year is about 5/billion. If I were to tell you "The human race will become extinct the year the LHC is switched on", would you really tell me "Greater than 80% chance it has nothing to do with the LHC" and go about your business?
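The closing question can be checked with the comment's own numbers. A minimal sketch, assuming a flat 1-in-a-billion prior for the LHC (the figure being argued against) and treating the LHC and natural extinction as the only candidate causes, which is my simplification, not the commenter's:

```python
# Given that extinction happens the year the LHC is switched on,
# how likely is the LHC (rather than background risk) to be the cause?
p_lhc_destroys = 1e-9           # the sub-one-in-a-billion prior under discussion
p_natural_per_year = 5 / 1e9    # ~5 mass extinctions in the past billion years

p_lhc_given_extinction = p_lhc_destroys / (p_lhc_destroys + p_natural_per_year)
print(round(p_lhc_given_extinction, 3))  # 0.167
```

So under that prior, yes: one would say "greater than 80% chance it has nothing to do with the LHC", which is the point of the rhetorical question.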
I am still uncomfortable with the whole "shut up and multiply" concept too. But I think that's where the "shut up" part comes in. You don't have to be comfortable with it. You don't have to like it. But if the math checks out, you just shut up and keep your discomfort to yourself, because math is math and bad things happen when you ignore it.
Here we run into the problem of "garbage in, garbage out."
He assigned 50% extinction risk for the 21st century in his book. His overall estimates of risk are quite high.
What your probability discussion there seems to me to be saying is "these numbers are too small to think about in any sensible way, let alone calculate." Trying to think about them closely resembles an argument that the way to deal with technological existential risk is to give up technology and go back to the savannah (caves are too techy).
But the math leads to statements like "switching on the LHC is equivalent to killing 6,000 people for certain", which seems to constitute a reductio ad absurdum of whatever process led to such a sentence.
(You could justify it philosophically, but you're likely to get an engineer's answer: "No it isn't. Here, I'll show you. (click) Now, how many individuals did that just kill?")
One day I would like to open up an inverse casino.
The inverse casino would be full of inverse slot machines. Playing the inverse slot machines costs negative twenty-five cents - that is, each time you pull the bar on the machine, it gives you a free quarter. But once every few thousand bar pulls, you will hit the inverse jackpot, and be required to give the casino several thousand dollars (you will, of course, have signed a contract to comply with this requirement before being allowed to play).
You can also play the inverse lottery. There are ten million inverse lottery tickets, and anyone who takes one will get one dollar. But if your ticket is drawn, you must pay me fifteen million dollars. If you don't have fifteen million dollars, you will have various horrible punishments happen to you until fifteen million dollars worth of disutility have been extracted from you.
If you believe what you are saying, it seems to me that you should be happy to play the inverse lottery, and believe there is literally no downside. And it seems to me that if you refused, I could give you the engineer's answer "Look, (buys ticket) - a free dollar, and nothing bad happened to me!"
And if you are willing to play the inverse lottery, then you should be willing to play the regular lottery, unless you believe the laws of probability work differently when applied to different numbers.
The hedge fund industry called. They want their idea of selling far out-of-the-money options back.
Doesn't this describe the standard response to cars?
Just think of all the low-probability risks cars subsume! Similarly, if you take up smoking you no longer need to worry about radon in your walls, pesticides in your food, air pollution or volcano dust. It's like a consolidation loan! Only dumber.
Sorry, I don't understand. Response to cars?
Most of life is structured as a negative lottery. You get in a car, you get where you're going much faster- but if the roulette ball lands on 00, you're in the hospital or dead. (If it only lands on 0, then you're just facing lost time and property.)
And so some people are mildly afraid of cars, but mostly people are just afraid of bad driving or not being in control- the negative lottery aspect of cars is just a fact of life, taken for granted and generally ignored when you turn the key.
The reason I recommend David not play the inverse lottery isn't because all things that give small rewards for a small probability of great loss are bad, it's because the inverse lottery (like the regular lottery) is set up so that the expected utility of playing is lower than the expected utility of not playing. An inverse lottery in which the expected utility of playing is better than the expected utility of not playing would be a good bet.
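The expected-utility comparison can be spelled out with the inverse lottery's stated numbers (ten million tickets, a dollar per player, a fifteen-million-dollar jackpot loss):

```python
# Expected value per inverse-lottery ticket.
tickets = 10_000_000
payout = 1.0                  # every player receives a dollar
jackpot_loss = 15_000_000.0   # owed if your ticket is drawn

ev_play = payout - jackpot_loss / tickets
print(ev_play)  # -0.5
```

Each "free dollar" costs fifty cents in expectation, which is why refusing to play is correct even though almost every individual player walks away ahead.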
A good argument for driving cars wouldn't be that an accident could never happen and is ridiculous (which is how I interpret David's pro-LHC argument) but that the benefits gained from driving cars outweigh the costs.
In the case of your original assertion - that it was reasonable to worry about the risks of the LHC - the argument for the probability of disaster being too small to worry about is that we're not working out the probability assuming such events have never happened before - we're working it out knowing that such events, and stronger ones, happen all the time, because they do. So many collisions of greater energies occur near Earth alone that this puts a strong upper bound on the chance of disaster occurring in the LHC itself. Even multiplied by 6E9, the number is, as I said, much less like 1E-6 and much more like "but there's still a chance, right?"
No. No, there really isn't.
This is plausible and I shall contemplate it.
By the way, and a little bit on topic, I think it's not a coincidence that an inverse casino would be more expensive to run than a regular casino.