Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

[LINK] Fate of our meta-stable Universe: killed by a vacuum bubble?

0 Post author: shminux 19 February 2013 10:29PM

The talk by Joseph Lykken ("lykken" is, ironically, Norwegian for "luck"), making the science news rounds today, conjectures that the Higgs field may some day destroy the universe: a bubble of "true vacuum" expanding at light speed, consuming our "false vacuum" and everything in it. Here is the original paper. The interesting part for me is not the idea itself, which is not at all new, but the related anthropic reasoning, which goes as follows:

That alternate universe would be "much more boring," Lykken said. Which led him to ask a philosophical question: "Why do we live in a universe that's just on the edge of stability?" He wondered whether a universe has to be near the danger zone to produce galaxies, stars, planets ... and life.

Cue Frost's Fire and Ice...

 

Comments (8)

Comment author: Mitchell_Porter 19 February 2013 11:53:23PM 3 points

A much more interesting possibility is that the Higgs is on this threshold for a non-anthropic reason.

A 2009 paper correctly predicted the Higgs mass by assuming that the standard model plus gravity has the property of "asymptotic safety", a technical property of the "renormalization group flow" which would have implications for the values of coupling constants at the Planck scale. Asymptotic safety apparently contradicts black hole thermodynamics and may simply be a wrong hypothesis about the mathematical properties of the theory, but there may be other ways to motivate the argument.

Another ingredient of the argument is the existence of a "grand desert" between the weak scale and the Planck scale: no new physics in that range of energies, so that the posited extrapolation of weak-scale Higgs mass from Planck-scale boundary conditions can go through. That runs against the theoretical trend of the past forty years, which posits unification, supersymmetry...

In fact, one of the arguments for new physics at the LHC has been that there must be new particles at the weak scale in order to stabilize the Higgs mass at that order of magnitude (the "hierarchy problem"). Only the Higgs has shown up so far. So if the Higgs mass can instead be stabilized by conditions at the Planck scale, but only by supposing this desert of no new physics, that certainly encourages the doubters who never believed in all those increasingly baroque theoretical constructions.

At the same time, we need to explain dark matter and neutrino masses somehow, and the attempt to explain them with new physics below the weak scale looks rather contrived; but maybe that can be accomplished with a new symmetry principle... To me, all those debates, which involve the construction of detailed predictive models, seem much more like the future of physics than the anthropic vagueness does.

Comment author: EHeller 20 February 2013 03:54:47AM 2 points

Neutrino masses potentially need no new physics. A neutrino mass term fits the SU(3) x SU(2) x U(1) symmetry of the standard model (it is the only dimension-5 operator consistent with that symmetry). If we give up 't Hooft's principle that physical models need to be renormalizable (after all, it's not true of GR or the standard model!) then of course neutrinos have mass.
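For concreteness, the dimension-5 operator EHeller refers to is the Weinberg operator, written schematically below with the lepton doublet L, the Higgs doublet H, a dimensionless coefficient c, and a new-physics scale Λ. After electroweak symmetry breaking, with the Higgs vacuum expectation value v ≈ 246 GeV, it generates a small Majorana neutrino mass:

```latex
\mathcal{O}_5 \;=\; \frac{c}{\Lambda}\,(LH)(LH)
\qquad\Longrightarrow\qquad
m_\nu \;\sim\; \frac{c\,v^2}{\Lambda}
```

For c of order one, an observed mass scale m_ν ~ 0.1 eV points to Λ ~ 10^15 GeV, i.e. the suppression scale sits far above the weak scale, which is why this operator needs no new weak-scale particles.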

Comment author: Eliezer_Yudkowsky 22 February 2013 05:36:44AM 2 points

Would we actually notice if our amplitude was constantly diminishing by such a tiny factor? I wonder what that would be like to be...

Comment author: shminux 22 February 2013 06:08:38PM *  0 points

I'm not sure that's how it works. As I understand it, the model predicts bubble nucleation, not state leakage, due to spatial inhomogeneities: a local energy fluctuation leads to a true-vacuum bubble forming and expanding.

However, let's leave aside the spatial inhomogeneity for the moment. As I understand it, tunneling in many worlds would result in a continuum of decayed worlds being continuously spawned, all with equal and infinitesimal probability, with the un-decayed one slowly decreasing in probability.

Assuming a Schrödinger's-cat-type experiment, with Eliezer as the cat, and assuming that Eliezer dies in every decayed world (not an unreasonable assumption if it's the vacuum that decays), and assuming a quantum-immortality-type ontology (quite a number of assumptions), Eliezer will only ever perceive the surviving branch, with no measurable leakage.
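To put a number on "no measurable leakage": if bubble nucleation in our past light cone behaves like a Poisson process with some fixed rate, the probability of the undecayed branch falls exponentially. A minimal sketch, using a made-up nucleation rate purely for illustration (the papers in question predict lifetimes vastly exceeding the age of the universe):

```python
import math

def survival_probability(t_years, rate_per_year):
    """P(no true-vacuum bubble has nucleated by time t),
    modeling nucleation as a Poisson process with constant rate."""
    return math.exp(-rate_per_year * t_years)

# Hypothetical rate: one expected nucleation per 10^100 years.
rate = 1e-100
age_of_universe = 1.38e10  # years

p = survival_probability(age_of_universe, rate)
# The exponent is rate * t ~ 1.4e-90, so the undecayed branch's
# probability is indistinguishable from 1 over cosmological time --
# nothing any observer inside could notice.
```

The point of the sketch is only the scaling: the "leakage" per lifetime is of order rate × time, and for any rate compatible with our universe's observed longevity that product is unobservably small.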

Comment author: ikrase 22 February 2013 10:54:53AM 0 points

Can there be a defense mechanism against this? The only thing I can think of involves virtualization, and arranging for a virtualization platform to exist in the transformed universe, and that seems iffy. Granted, I don't know much about this.

Comment author: karlrand 21 February 2013 12:22:52PM 0 points

"Why do we live in a universe that's just on the edge of stability?" Because the form of life we represent can only exist in such a universe? Because (excuse me hinting at something potentially theological) the 'Creator' of our universe finds such realms more interesting than others? On a more serious note: why should we reject any proposition simply because it 'runs against the theoretical trend of the past forty years'?

Comment author: shminux 21 February 2013 07:05:48PM 0 points

why should we reject any proposition simply because it 'runs against the theoretical trend of the past forty years'

Not sure what you mean by this. The only test of a proposition is an experimental one, though its a priori likelihood certainly depends on how well it matches existing tested models.

Comment author: karlrand 22 February 2013 10:03:20AM 0 points

Simply that attaching a truth value to a theoretical trend because 'it's been running for the past forty years' strikes me as mere subjectivism. As to 'matching existing tested models': consider the vast number of events taking place during, say, a particle collision at CERN. There are so many that only a small proportion can be detected and recorded. This engenders the creation of specific software aimed at detecting a limited range of anticipated events. A self-fulfilling prophecy could be one result.