From an actual physicist:
Chang Kee Jung, a neutrino physicist at Stony Brook University in New York, says he'd wager that the result is the product of a systematic error. "I wouldn't bet my wife and kids because they'd get mad," he says. "But I'd bet my house."
I'll take bets at 99-to-1 odds against any information propagating faster than c. Note that this is not a bet for the results being methodologically flawed in any particular way, though I would indeed guess some simple flaw. It is just a bet that when the dust settles, it will not be possible to send signals at a superluminal velocity using whatever is going on - that there will be no propagation of any cause-and-effect relation at faster than lightspeed.
My real probability is lower, but I think that anyone who'd bet against me at 999-to-1 will probably also bet at 99-to-1, so 99-to-1 is all I'm offering.
I will not accept more than $20,000 total of such bets.
I'll take that bet, for a single pound on my part against 99 from Eliezer.
(explanation: I have a 98-2 bet with my father against the superluminal information propagation being true, so this sets up a nice little arbitrage).
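The arbitrage works out like this, assuming the stakes are interpreted as: £1 against Eliezer's £99 on the FTL side, and £98 against my father's £2 on the anti-FTL side (a minimal sketch; the exact stake sizes are my reading of the two bets):

```python
def net_payoff(ftl_is_real: bool) -> int:
    """Net winnings in pounds across both bets, in either world."""
    # vs. Eliezer: stake 1 at 99-to-1, taken on the FTL side
    eliezer_bet = 99 if ftl_is_real else -1
    # vs. father: stake 98 at 98-to-2, taken against FTL
    father_bet = -98 if ftl_is_real else 2
    return eliezer_bet + father_bet

# Either way the physics turns out, the combined position nets +1 pound.
print(net_payoff(True), net_payoff(False))
```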
Actually, what is the worst that could happen? It's not [the structure of the universe is destabilized by the breakdown of causality], because that would have already happened if it were going to.
The obvious one would be [Eliezer loses $20,000], except that would only occur in the event that it were possible to violate causality, in which case he would presumably arrange to prevent his past self from making the bet in the first place, yeah? So really, it's a win-win.
Unless one of the people betting against him is doing so because ve received a mysterious parchment on which was written, in ver own hand, "MESS WITH TIME."
It's not about transmitting information into the past - it's about the locality of causality. Consider Judea Pearl's classic graph with SEASONS at the top, SEASONS affecting RAIN and SPRINKLER, and RAIN and SPRINKLER both affecting the WETness of the sidewalk, which can then become SLIPPERY. The fundamental idea and definition of "causality" is that once you know RAIN and SPRINKLER, you can evaluate the probability that the sidewalk is WET without knowing anything about SEASONS - the universe of causal ancestors of WET is entirely screened off by knowing the immediate parents of WET, namely RAIN and SPRINKLER.
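The screening-off property can be checked numerically on a toy version of Pearl's graph. The conditional probability tables below are made-up numbers purely for illustration; the point is that once the full joint distribution is built from the causal structure, conditioning on SEASON changes nothing after RAIN and SPRINKLER are fixed:

```python
from itertools import product

# Toy CPTs (all numbers are hypothetical, chosen only for illustration).
p_season = {"dry": 0.5, "rainy": 0.5}
p_rain = {"dry": 0.1, "rainy": 0.7}        # P(RAIN=1 | SEASON)
p_sprink = {"dry": 0.8, "rainy": 0.2}      # P(SPRINKLER=1 | SEASON)
p_wet = {(0, 0): 0.0, (0, 1): 0.9,
         (1, 0): 0.9, (1, 1): 0.99}        # P(WET=1 | RAIN, SPRINKLER)

def joint():
    """Enumerate the full joint distribution implied by the causal graph."""
    for season in p_season:
        for rain, sprink, wet in product((0, 1), repeat=3):
            p = p_season[season]
            p *= p_rain[season] if rain else 1 - p_rain[season]
            p *= p_sprink[season] if sprink else 1 - p_sprink[season]
            p *= p_wet[rain, sprink] if wet else 1 - p_wet[rain, sprink]
            yield (season, rain, sprink, wet), p

def p_wet_given(rain, sprink, season=None):
    """P(WET=1 | RAIN, SPRINKLER[, SEASON]), computed from the joint."""
    num = den = 0.0
    for (s, r, sp, w), p in joint():
        if r == rain and sp == sprink and season in (None, s):
            den += p
            num += p * w
    return num / den

# Screening off: once RAIN and SPRINKLER are known, SEASON is irrelevant.
assert abs(p_wet_given(1, 0, "dry") - p_wet_given(1, 0, "rainy")) < 1e-12
```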
Right now, we have a physics where (if you don't believe in magical collapses) the amplitude at any point in quantum configuration space is causally determined by its immediate neighborhood of parental points, both spatially and in the quantum configuration space.
In other words, so long as I know the exact (quantum) state of the universe for 300 meters around a point, I can predict the exact (quantum) future of that point 1 microsecond into the future without knowing anything whatsoever about the rest of the universe. If I know the exact state for 3 meters around,...
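The numbers above are just light-travel arithmetic: the radius of present state you need is the distance light covers in the prediction interval. A quick sketch:

```python
C = 299_792_458.0  # speed of light, m/s

def causal_radius_m(seconds: float) -> float:
    """Radius outside which nothing can influence a point within `seconds`,
    given locality: everything beyond C * seconds is causally screened off."""
    return C * seconds

print(causal_radius_m(1e-6))  # 1 microsecond of future: ~300 m of present state
print(causal_radius_m(1e-8))  # 10 nanoseconds: ~3 m
```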
This is starting to remind me of Kant. Specifically, his attempt to provide an a priori justification for the then-known laws of physics. This made him look incredibly silly once relativity and quantum mechanics came along.
And Einstein was better at the same sort of philosophy and used it to predict new physical laws that he thought should have the right sort of style (though I'm not trying to do that, just read off the style of the existing model). But anyway, I'd pay $20,000 to find out I'm that wrong - what I want to eliminate is the possibility of paying $20,000 to find out I'm right.
People in this thread with physics backgrounds should say so so that I can update in your direction.
When I looked at the paper, my impression was that this is a persistent result in the experiment, which would explain publication: the experiment's results will be public, and someone, eventually, will notice this in the data. Better that CERN officially notice this in the data than Random High Energy Physicist. People relying on CERN's move to publish may want to update to account for this fact.
Let's say you're a physicist maximizing utility. It's pretty embarrassing to publish results with mistakes in them, and the more important the results, the more embarrassing it would be to announce results later shown to be the product of some kind of incompetence. So one can usually expect published results of serious import to have been checked over and over for errors.
But the calculus changes when we introduce the incentive of discovering something before anyone else. This is particularly the case when the discovery is likely to lead to a Nobel prize. In this case a physicist might be less diligent about checking the work in order to make sure she is the first out with the new results.
Now in this case CERN-OPERA is pretty much the only game in town. No one else can measure this many neutrinos with this kind of accuracy. So it would seem like they could take all the time they needed to check all the possible sources of error. But if Hyena is right that OPERA's data is/was shortly going to be public then they risk someone outside CERN-OPERA noticing the deviation from expected delay and publishing the results. By itself that is pretty embarrassing and it introduces some controversy ...
Relevant: The Beauty of Settled Science
I'm waiting for another experiment before I get too worked up about this result.
That MINOS saw something like this before is pretty interesting. Another thing to consider is SN 1987A: at the speed the CERN neutrinos were traveling, we should have detected the neutrinos from SN 1987A about four years before the supernova was visible.
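The "four years" figure is a back-of-envelope calculation, assuming OPERA's reported fractional speed excess of (v - c)/c ≈ 2.48e-5 and a distance to SN 1987A of roughly 168,000 light-years:

```python
DELTA = 2.48e-5        # OPERA's reported (v - c)/c
DISTANCE_LY = 168_000  # rough distance to SN 1987A, in light-years

# Light takes DISTANCE_LY years to arrive; neutrinos moving a fraction
# DELTA faster would outrun it by:
lead_time_years = DISTANCE_LY * (1 - 1 / (1 + DELTA))
print(round(lead_time_years, 1))  # ~4.2 years
```

In fact the SN 1987A neutrinos arrived within hours of the light, which is strong evidence against the OPERA speed holding at supernova energies.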
The fact that this was made public like this suggests they are very confident they haven't made any obvious errors.
This paper discusses the possibility of neutrino time travel.
There is a press conference at 10 AM EST.
I'll say 0.9 for a non-trivial experimental set-up error (no new physics, but nothing silly either), and 0.005 for something incompetent or fraudulent. The remainder is new physics: "something I don't know about", "neutrinos sometimes travel backwards in time", and "special relativity is wrong", at 8000:800:1.
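Turning those odds ratios into probabilities is mechanical; a small sketch, splitting the residual ~0.095 of probability mass in the stated 8000:800:1 ratio:

```python
def split_mass(odds: dict, total_mass: float) -> dict:
    """Divide `total_mass` of probability among hypotheses in ratio `odds`."""
    s = sum(odds.values())
    return {k: total_mass * v / s for k, v in odds.items()}

# Residual mass after 0.9 (setup error) and 0.005 (incompetence/fraud):
new_physics = split_mass(
    {"unknown effect": 8000, "backwards in time": 800, "SR is wrong": 1},
    total_mass=1 - 0.9 - 0.005,
)
print(new_physics)  # "SR is wrong" ends up at roughly 1e-5
```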
Perhaps the end of the era of the light cone and beginning of the era of the neutrino cone?
Does that work? Once you beat light don't you just win the speed race? The in-principle upper bound on what can be influenced just disappears. The rest is just engineering. Trivial little details of how to manufacture a device that emits a finely controlled output of neutrinos purely by shooting other neutrinos at something.
I strongly suspect that this is due to human error (say 95%). A few people in this thread are tossing around much higher probabilities, but given that these aren't a bunch of crackpots but researchers at CERN, that seems like overconfidence. (1 - 10^-8 is really, really confident.) The strongest evidence that this is an error is that the neutrinos aren't arriving much faster than light, but only a tiny bit faster.
I'm now going to list some of the 5%. I don't know enough to discuss their likelihood in detail.
1) Neutrinos oscillating into a ...
Ok. I think there's one thing that should be stated explicitly in this thread that may not have been getting enough attention (and about which, in my own comments, I probably should have been more explicit).
The options are not "CERN screwed up" and "neutrinos can move faster than c." I'm not sure about the actual probabilities but P(neutrinos can move faster than c|CERN didn't screw up) is probably a lot less than P(Weird new physics that doesn't require faster than light particles|CERN didn't screw up).
My probability distribution of explanations:
Having read the preprint, about the only observation I have is that I think you're overestimating the fraud hypothesis.
There’s almost a whole page of authors, the preprint describes only the measurement, and finishes with something like (paraphrasing) “we’re pretty sure of seeing the effect, but given the consequences of this being new physics we think more checking is needed, and since we’re stumped trying to find other sources of error, we publish this to give others a try too; we deliberately don’t discuss any possible theoretical implications.”
At the very least, this is the work of the aggregate group trying very hard to “do it right”; I guess there could still be one rogue data manipulator, but I would give much less than 1 in 20 that nobody else in the group noticed anything funny.
The comparison to parapsychology is a really poor one in this case, for what should be pretty obvious reasons. For example, we know there is no file-drawer effect. What we know about neutrino speed so far comes from (a) supernova measurements, which contradict these results but measured much lower energy neutrinos, and (b) direct measurements that didn't have the sample size or the timing accuracy to reveal the anomaly OPERA discovered.
But more importantly this was a six sigma deviation from theoretical prediction. As far as I know, that is unheard of in parapsychology.
We cannot treat physics the way we treat psychology.
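For scale, "six sigma" corresponds to a chance fluctuation of roughly one in a billion, which can be read off the Gaussian tail:

```python
from math import erfc, sqrt

def sigma_to_p(n_sigma: float) -> float:
    """One-sided tail probability of an n-sigma Gaussian fluctuation."""
    return 0.5 * erfc(n_sigma / sqrt(2))

print(sigma_to_p(6))  # ~1e-9: chance alone essentially never produces this
```

Of course, that tail probability only quantifies statistical fluctuation; it says nothing about systematic error, which is where everyone expects the problem to be.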
Relevant updates:
John Costella has a fairly simple statistical analysis which strongly suggests that the OPERA data is statistically significant (pdf). This of course doesn't rule out systematic problems with the experiment, which still seem to be the most likely explanation.
Costella has also proposed possible explanations of the data. See 1 and 2. These proposals focus on the idea of a short-lived tachyon. This sort of explanation helps explain the SN 1987a data. Costella points out that if the muon-neutrino pair is becoming tachyonic through the initial hadron ba...
More relevant papers:
"Neutrinos Must Be Tachyons" (1997)
Abstract: The negative mass squared problem of the recent neutrino experiments from the five major institutions prompts us to speculate that, after all, neutrinos may be tachyons. There are number of reasons to believe that this could be the case. Stationary neutrinos have not been detected. There is no evidence of right handed neutrinos which are most likely to be observed if neutrinos can be stationary. They have the unusual property of the mass oscillation between flavors which has not be...
The neutrinos are not going faster than light: P = 1 - 10^-8.
Conditional on that, the error is caused by some novel physical effect: P = 0.15.
Conditional on that, human error accounts for the effect (i.e. no new physics): P = 0.85.
This isn't even worth talking about unless you know a serious amount about the precise details of the experiment.
EDIT: Serious updating on the papers Jack links to downthread. I hadn't realised that neutrinos have never been observed going slower than light. P = no clue whatsoever.
There's now a theoretical paper up on the arXiv discussing a lot of these issues. The authors are respected physics people, it seems. I have neither the time nor the expertise to evaluate it, but they seem to be claiming a resolution between the OPERA data and the SN 1987A data.
The best short form critique of this announcement I have seen is the post by theoretical physicist Matthew Buckley on the metafilter website:
After I read that comment I clicked through to his personal website, where I found a nifty layman's explanation of the necessity for Dark Matter in current cosmological theory:
Matt's web essay on dark matter.
If you don't have time to read his comment, what he says is that the results are not obviously bogus but they are so far-fetched that almost no physicists will find their daily work affected by the provisional...
Sean Carroll has made a second blog post on the topic, to explain why faster-than-light neutrinos do not necessarily imply time travel.
...The usual argument that faster than light implies the ability to travel on a closed loop assumes Lorentz invariance; but if we discover a true FTL particle, your first guess should be that Lorentz invariance is broken. (Not your only possible guess, but a reasonable one.) Consider, for example, the existence of a heretofore unobserved fluid pervading the universe with a well-defined rest frame, that neutrinos interact wit
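The usual Lorentz-invariance argument Carroll refers to is easy to see numerically. If Lorentz invariance does hold, a signal moving at u > c arrives before it was sent in some boosted frames (whenever the frame speed v exceeds c²/u). A sketch, in units where c = 1, using OPERA's claimed u = 1.0000248c:

```python
from math import sqrt

def boosted_interval(dt: float, dx: float, v: float) -> float:
    """Time between emission and reception as seen from a frame moving at
    speed v (units where c = 1), via the Lorentz transformation."""
    gamma = 1.0 / sqrt(1.0 - v * v)
    return gamma * (dt - v * dx)

u = 1.0000248          # claimed neutrino speed, in units of c
dt, dx = 1.0, u * 1.0  # signal covers dx = u * dt

print(boosted_interval(dt, dx, v=0.5))      # positive: effect still after cause
print(boosted_interval(dt, dx, v=0.99999))  # negative: reception precedes emission
```

The sign flips only for frames faster than c²/u ≈ 0.999975c, which is why such a tiny speed excess still threatens causality under Lorentz invariance, and why Carroll's escape route is to break that invariance instead.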
To quote one of my professors, from the AP release:
...Drew Baden, chairman of the physics department at the University of Maryland, said it is far more likely that there are measurement errors or some kind of fluke. Tracking neutrinos is very difficult, he said.
"This is ridiculous what they're putting out," Baden said, calling it the equivalent of claiming that a flying carpet is invented only to find out later that there was an error in the experiment somewhere. "Until this is verified by another group, it's flying carpets. It's cool, but ..
Forgive my ignorance, but... if distance is defined in terms of the time it takes light to traverse it, what's the difference between "moving from A to B faster than the speed of light" and "moving from B to A"?
http://www.nature.com/news/2011/110922/full/news.2011.554.html
http://arxiv.org/abs/1109.4897v1
http://usersguidetotheuniverse.com/?p=2169
http://news.ycombinator.com/item?id=3027056
"Perhaps the end of the era of the light cone and beginning of the era of the neutrino cone?"

I'd be curious to see your probability estimates for whether this theory pans out. Or other crackpot hypotheses to explain the results.