Horrible LHC Inconsistency

17 Eliezer_Yudkowsky 22 September 2008 03:12AM

Followup to: When (Not) To Use Probabilities, How Many LHC Failures Is Too Many?

While trying to answer my own question on "How Many LHC Failures Is Too Many?" I realized that I'm horrendously inconsistent with respect to my stated beliefs about disaster risks from the Large Hadron Collider.

First, I thought that stating a "one-in-a-million" probability for the Large Hadron Collider destroying the world was too high, in the sense that I would much rather run the Large Hadron Collider than press a button with a known 1/1,000,000 probability of destroying the world.

But if you asked me whether I could make one million statements of authority equal to "The Large Hadron Collider will not destroy the world", and be wrong, on average, around once, then I would have to say no.

Unknown pointed out that this turns me into a money pump.  Given a portfolio of a million existential risks to which I had assigned a "less than one in a million probability", I would rather press the button on the fixed-probability device than run a random risk from this portfolio; but would rather take any particular risk in this portfolio than press the button.
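The money-pump structure can be made concrete with a small sketch.  (The specific per-risk figure of 9e-7 below is a hypothetical stand-in for "less than one in a million"; it is not a number from the original discussion.)

```python
# Hypothetical portfolio: a million existential risks, each assigned a
# probability strictly below the button's known one-in-a-million.
risks = [9e-7] * 1_000_000
button = 1e-6

# Stated preference 1: take any *particular* risk rather than press the
# button.  Taken alone this is consistent, since every individual risk is
# below the button's probability:
assert all(p < button for p in risks)

# Stated preference 2: press the button rather than run a *random* risk
# from the portfolio.  But a uniformly random draw destroys the world with
# the portfolio's average probability...
average = sum(risks) / len(risks)

# ...which is also below the button's probability, so preferring the button
# over the random draw contradicts preference 1.
assert average < button
```

The point of the sketch: a uniform mixture of options can never be worse than every one of its components, so the two stated preferences cannot both follow from a single consistent probability assignment.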

Then, I considered the question of how many mysterious failures at the LHC it would take to make me question whether it might destroy the world/universe somehow, and what this revealed about my prior probability.

If each failure had a known 50% probability of occurring from natural causes, like a quantum coin or some such... then I suspect that if I actually saw that coin come up heads 20 times in a row, I would feel a strong impulse to bet on its coming up heads the next time around.  (And that's taking into account my uncertainty about whether the anthropic principle really works that way.)

Even having noticed this triple inconsistency, I'm not sure in which direction to resolve it!

(But I still maintain my resolve that the LHC is not worth expending political capital, financial capital, or our time to shut down; compared with using the same capital to worry about superhuman intelligence or nanotechnology.)

How Many LHC Failures Is Too Many?

16 Eliezer_Yudkowsky 20 September 2008 09:38PM

Recently the Large Hadron Collider was damaged by a mechanical failure.  This requires the collider to be warmed up, repaired, and then cooled down again, so we're looking at a two-month delay.

Inevitably, many commenters said, "Anthropic principle!  If the LHC had worked, it would have produced a black hole or strangelet or vacuum failure, and we wouldn't be here!"

This remark may be somewhat premature, since I don't think we're yet at the point in time when the LHC would have started producing collisions if not for this malfunction.  However, a few weeks(?) from now, the "Anthropic!" hypothesis will start to make sense, assuming it can make sense at all.  (Does this mean we can foresee executing a future probability update, but can't go ahead and update now?)

As you know, I don't spend much time worrying about the Large Hadron Collider when I've got much larger existential-risk-fish to fry.  However, there's an exercise in probability theory (which I first picked up from E.T. Jaynes) along the lines of, "How many times does a coin have to come up heads before you believe the coin is fixed?"  This tells you how low your prior probability is for the hypothesis.  If a coin comes up heads only twice, that's definitely not a good reason to believe it's fixed, unless you already suspected it from the beginning.  But if it comes up heads 100 times, it's taking you too long to notice.
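Jaynes's exercise can be run as simple odds arithmetic: if the alternative to "fair coin" is a coin that always comes up heads, each observed head multiplies the odds in favor of "fixed" by two.  A minimal sketch, with the caveat that the one-in-a-million prior below is an illustrative assumption, not a figure from the post:

```python
def heads_until_convinced(prior_fixed, threshold=0.5):
    """Number of consecutive heads needed before the posterior probability
    that the coin is fixed (always-heads) exceeds `threshold`, starting
    from prior probability `prior_fixed` against a fair-coin alternative."""
    odds = prior_fixed / (1.0 - prior_fixed)  # prior odds for "fixed"
    n = 0
    while odds / (1.0 + odds) <= threshold:
        odds *= 2.0  # likelihood ratio per head: P(H|fixed)/P(H|fair) = 2
        n += 1
    return n

# A one-in-a-million prior takes about 20 heads to overcome:
print(heads_until_convinced(1e-6))   # -> 20
# A mild 1% suspicion needs only 7:
print(heads_until_convinced(0.01))   # -> 7
```

Reading it backwards gives the diagnostic in the text: the number of heads you demand before switching beliefs reveals roughly how many doublings, and hence how low a prior, you started with.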

So - taking into account the previous cancellation of the Superconducting Supercollider (SSC) - how many times does the LHC have to fail before you'll start considering an anthropic explanation?  10?  20?  50?

After observing empirically that the LHC had failed 100 times in a row, would you endorse a policy of keeping the LHC powered up, but trying to fire it again only in the event of, say, nuclear terrorism or a global economic crash?

Hiroshima Day

1 Eliezer_Yudkowsky 06 August 2008 11:15PM

On August 6th, 1945, the world saw the first use of atomic weapons against human targets.  On this day 63 years ago, humanity lost its nuclear virginity.  Until the end of time we will be a species that has used fission bombs in anger.

Time has passed, and we still haven't blown up our world, despite a close call or two.  Which makes it difficult to criticize the decision - would things still have turned out all right, if anyone had chosen differently, anywhere along the way?

Maybe we needed to see the ruins, of the city and the people.

Maybe we didn't.

There's an ongoing debate - and no, it is not a settled issue - over whether the Japanese would have surrendered without the Bomb.  But I would not have dropped the Bomb even to save the lives of American soldiers, because I would have wanted to preserve that world where atomic weapons had never been used - to not cross that line.  I don't know about history up to this point; but I think the world would be safer today if no one had ever used atomic weapons in war, and the idea were not considered suitable for polite discussion.

I'm not saying it was wrong.  I don't know for certain that it was wrong.  I wouldn't have thought that humanity could make it this far without using atomic weapons again.  All I can say is that if it had been me, I wouldn't have done it.

A Genius for Destruction

9 Eliezer_Yudkowsky 01 August 2008 07:25PM

This is a question from a workshop after the Global Catastrophic Risks conference.  The rule of the workshop was that people could be quoted, but not attributed, so I won't say who observed:

"The problem is that it's often our smartest people leading us into the disasters.  Look at Long-Term Capital Management."

To which someone else replied:

"Maybe smart people are just able to work themselves up into positions of power, so that if damage gets caused, the responsibility will often lie with someone smart."


When (Not) To Use Probabilities

28 Eliezer_Yudkowsky 23 July 2008 10:58AM

Followup to: Should We Ban Physics?

It may come as a surprise to some readers of this blog, that I do not always advocate using probabilities.

Or rather, I don't always advocate that human beings, trying to solve their problems, should try to make up verbal probabilities, and then apply the laws of probability theory or decision theory to whatever number they just made up, and then use the result as their final belief or decision.

The laws of probability are laws, not suggestions, but often the true Law is too difficult for us humans to compute.  If P != NP and the universe has no source of exponential computing power, then there are evidential updates too difficult for even a superintelligence to compute - even though the probabilities would be quite well-defined, if we could afford to calculate them.

So sometimes you don't apply probability theory.  Especially if you're human, and your brain has evolved with all sorts of useful algorithms for uncertain reasoning that don't involve verbal probability assignments.

Not sure where a flying ball will land?  I don't advise trying to formulate a probability distribution over its landing spots, performing deliberate Bayesian updates on your glances at the ball, and calculating the expected utility of all possible strings of motor instructions to your muscles.


Should We Ban Physics?

11 Eliezer_Yudkowsky 21 July 2008 08:12AM

Nobel laureate Marie Curie died of aplastic anemia, the victim of radiation from the many fascinating glowing substances she had learned to isolate.

How could she have known?  And the answer, as far as I can tell, is that she couldn't.  The only way she could have avoided death was by being too scared of anything new to go near it.  Would banning physics experiments have saved Curie from herself?

But far more cancer patients than just one person have been saved by radiation therapy.  And the real cost of banning physics is not just losing that one experiment - it's losing physics.  No more Industrial Revolution.

Some of us fall, and the human species carries on, and advances; our modern world is built on the backs, and sometimes the bodies, of people who took risks.  My father is fond of saying that if the automobile were invented nowadays, the saddle industry would arrange to have it outlawed.

But what if the laws of physics had been different from what they are?  What if Curie, by isolating and purifying the glowy stuff, had caused something akin to a fission chain reaction gone critical... which, the laws of physics being different, had ignited the atmosphere or produced a strangelet?

