## CERN team claims measurement of neutrino speed >c

The implications of this experiment matter less than the fact that the
interpretation is incorrect. These neutrinos simply did not break the
speed-of-light barrier, and as a result any further extrapolation is
unnecessary. The reasoning is as follows:

1. Einstein showed that it cannot be done.

2. A mass-containing object that reaches the speed of light stops moving.
If these neutrinos were able to exceed the speed of light, then they
would not have reached the target facility and therefore could not be
observed in order to have their speed measured.

3. Transmogrification of sub-atomic particles is impossible. If the
neutrinos that are being sent from CERN are not the same sub-atomic
particles being observed at the target facility, then they are
measuring the speed of different objects.

4. Since observers affect the observation, and the experiment involves two
different facilities, each with different observers, the observer's speed
of light at the CERN facility is different from the observer's speed of
light at the target facility, and the difference between these speeds of
light will affect the experiment.

> Quote by PeterDonis: I posted earlier in this thread that my initial guess is that the physical distance would be slightly shorter, but I haven't calculated the effect.

GR effects are typically 10^-10, and we need something bigger than 10^-5.

The first-order GR effect is just plain "falling", and that's 60 microns. We need 60 feet.
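
The magnitudes quoted above can be checked with back-of-the-envelope arithmetic. This is only a rough sketch: the baseline is the published CERN-LNGS distance, and the "falling" estimate is just free fall over the light flight time.

```python
# Back-of-the-envelope check of the magnitudes discussed above.
c = 2.998e8          # speed of light, m/s
L = 730.085e3        # CERN-LNGS baseline, m (from the OPERA paper)
t = L / c            # light flight time, ~2.44 ms
g = 9.81             # m/s^2

# "Just plain falling": drop of the beam over the flight time.
drop = 0.5 * g * t**2          # a few tens of microns

# Distance error needed to fake the observed ~60 ns early arrival.
dt = 60e-9                     # s
needed = c * dt                # ~18 m, i.e. roughly 60 feet

print(f"flight time      : {t*1e3:.2f} ms")
print(f"free-fall drop   : {drop*1e6:.0f} microns")
print(f"distance to fake : {needed:.1f} m")
```

So the first-order "falling" correction is microns while the anomaly corresponds to about 18 m of baseline, six orders of magnitude apart, which is the point being made.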

> Quote by xts: That is also something that smells a bit fishy in the OPERA experiment. At CERN they measure a strong pulse of millions of muons, while at LNGS they detect single muons. So the shape of the signal, its discrimination, etc. plays a pretty significant role. The paper again does not explain clearly how they correct for such issues.

Again, this is trivial, and probably not worth reporting in detail. Anyone competent can do this. If you have a CFD (constant fraction discriminator), it's easier; otherwise, you measure the slewing and correct for it. It's also hard to get a 60 ns offset from a device that has a rise time of a few ns.

> Quote by Vanadium 50: The first-order GR effect is just plain "falling", and that's 60 microns. We need 60 feet.

Oh, well, another beautiful theory spoiled by an ugly fact. Thanks, this gives a good sense of the relative order of magnitude of GR effects for this experiment.
 Did anyone else watch the press conference? I watched it; although I didn't fully understand everything they were talking about, it sounded fairly good: they were very cautious in their claims and very open to questions.

> Quote by Vanadium 50: GR effects are typically 10^-10, and we need something bigger than 10^-5.

Also, things like the Sagnac effect are several orders of magnitude too small.

 Just one more (silly) doubt. The result is based on a collection of independent measurements, each of them having a statistical error of 2.8 microseconds (they come from a close-to-flat distribution of 10.5 microseconds width). How did they arrive at a final statistical error of 6.9 nanoseconds while having only 16,111 events in total?

> Quote by ColonialBoy: One of the diagrams in the paper refers to both locations using GPS to derive local time, and the diagram shows a single satellite. In fact GPS uses at least 3 and possibly 5 satellites, and each satellite has its own atomic clock. The satellites all transmit on the same frequency, and when the signals arrive back on Earth, timing differences as well as relativistic effects are combined to give local time. I'd like to see how accurate they BELIEVE their local clocks are. Sub-nanosecond?

Their local clocks are certainly accurate to well below a nanosecond. Note that one nanosecond is a huge time interval in modern time metrology (there are single-chip clocks with a 100 s Allan deviation better than 10^-11). Moreover, time transfer with an accuracy better than a few ns is more or less trivial. Hence, it is very unlikely that there are any systematic errors due to timekeeping or time transfer. Even a normal off-the-shelf commercial GPS clock will agree with UTC to better than about ±30 ns.

Also, note that both METAS and PTB have been involved in the timekeeping/transfer; there is virtually no chance that people from both of those organizations would overlook a mistake, since this is quite literally what they do every day (both PTB and METAS operate atomic clocks that contribute to UTC).
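
The timekeeping figures quoted above can be put side by side with some illustrative arithmetic (the numbers are the ones mentioned in this post, not measured values from the paper):

```python
# Illustrative clock-stability arithmetic using the figures quoted above.
adev = 1e-11          # Allan deviation at tau = 100 s (single-chip clock figure)
tau = 100.0           # averaging time, s
wander = adev * tau   # rough time wander accumulated over tau: ~1 ns

gps_vs_utc = 30e-9    # off-the-shelf GPS clock vs UTC, s (figure quoted above)
anomaly = 60e-9       # size of the OPERA early-arrival signal, s

print(f"clock wander over {tau:.0f} s : {wander*1e9:.1f} ns")
print(f"GPS-vs-UTC offset            : {gps_vs_utc*1e9:.0f} ns")
print(f"anomaly / GPS offset ratio   : {anomaly/gps_vs_utc:.0f}x")
```

Even the worst case here, a plain commercial GPS clock, sits only a factor of two below the 60 ns anomaly, and the dedicated metrology actually used is orders of magnitude better still.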
 http://cdsweb.cern.ch/collection/Video%20Lectures That's the link, but it looks like they haven't uploaded the video yet. It was still streaming only 10 minutes ago.

> Quote by xts: Just one more (silly) doubt.

Please look at Figure 11 in the paper.

> Quote by gambit7: Is it a viable check to undertake the suggestion of replicating the energies of the 1987 supernova?

No. The energies are too small by a factor of 1000. The neutrinos cannot be efficiently produced nor efficiently detected at these energies.
 Here's an interesting interview with Ereditato & Autiero posted on Youtube: http://www.youtube.com/watch?v=AN9IQyHzk90 (not many details about the experiment itself, but you can see how open minded they are about the results...) This one gives a broader description of CERN's neutrino experiments/OPERA (for those of us without advanced degrees in physics): http://www.youtube.com/watch?v=M3aB_zUZ1c8
 OK, can anyone explain this to me (and maybe a few others)? Have I got it right? The explanation of the astronomical evidence, the supernova explosion, is not that obvious. I mean, the neutrino pulse did arrive on Earth before the light pulse, as it would if the neutrinos were faster than light. And to be only 3 h apart after 160,000 years means they are impressively close in speed. And we can accept the light's excuse for lateness, that it got held up in the terribly dense traffic inside the supernova (I will try that excuse myself sometime). So that explains it away; I will accept that 3 h is a reasonable estimate for such a delay.

But that is only saying there is no contradiction. We can't calculate or observe the delay to the nearest billionth of a second, I'm sure. So what Strassler seems to rely on is not the coincidence of the two pulses but the fact that the neutrinos arrived closely bunched. Is that right?

Now, I know from scintillation counting that beta decays give off $\beta$'s with a spectrum of energies, and I suppose the neutrinos have a spectrum of kinetic energies. If they have a spectrum of energies, they must have a spectrum of velocities. But the observed spectrum of velocities is very narrow. So if what happens in supernovae is like what happened in my remembered scintillation counting, and there is a spectrum of energies, then the way you can have a broad energy spectrum and a narrow velocity spectrum is, by SR, when the particles are travelling close to the speed of light. Was something like that the implicit argument? So, being close to the speed of light, their rest mass must be very small. But "close to" doesn't quite tell me slower than or faster than.
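That implicit SR argument can be made quantitative: for $E \gg mc^2$, expanding $v/c = \sqrt{1 - (mc^2/E)^2}$ gives $1 - v/c \approx \tfrac{1}{2}(mc^2/E)^2$, so even a broad energy spectrum maps to an extremely narrow velocity spread. A sketch with illustrative numbers (a ~1 eV mass bound and the roughly 10-40 MeV range of the SN1987A detections are assumptions for the example, not values from this thread):

```python
# SR mapping from energy spread to velocity spread for ultra-relativistic
# neutrinos. Numbers are illustrative assumptions: a ~1 eV rest energy and
# the ~10-40 MeV range of the SN1987A detections.
m_c2 = 1.0                        # assumed neutrino rest energy, eV
energies_ev = [10e6, 20e6, 40e6]  # detection energies, eV

def beta_deficit(E):
    """1 - v/c for E >> m c^2, from expanding sqrt(1 - (m c^2 / E)^2)."""
    return 0.5 * (m_c2 / E) ** 2

for E in energies_ev:
    print(f"E = {E/1e6:4.0f} MeV : 1 - v/c ~ {beta_deficit(E):.1e}")

# Arrival-time spread accumulated over 160,000 light-years of travel:
trip_s = 160_000 * 3.156e7        # trip duration at ~c, in seconds
spread_s = trip_s * (beta_deficit(10e6) - beta_deficit(40e6))
print(f"arrival spread: ~{spread_s*1e3:.0f} ms")
```

With these assumed numbers a factor-of-4 energy spread produces an arrival spread of only tens of milliseconds after 160,000 years, which is why the observed close bunching constrains the mass so strongly. And as noted, the bunching alone only says "close to c", not which side of it.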

> Quote by Vanadium 50: (silly doubt: 6.9 ns error from a 16,000-event sample of measurements with 2800 ns errors) Please look at Figure 11 in the paper.

Fig. 11??? It illustrates that the data, once shifted by the 1048 ns correction, visually fit the prediction (whereas originally they were offset); it says nothing about statistical errors.

I just doubt how you can average a 16,000-event sample from a sigma = 2800 ns distribution and get a final result with sigma = 6.9 ns. I would rather expect the final sigma to be at least sigma/sqrt(N) = 22 ns.

The pulses are 10.5 microseconds wide. The protons (and hence the created neutrinos) are not distributed uniformly over that span, but, as the example pulse shows (Fig. 4), their sigma is definitely bigger than 875 ns (which would be the maximum per-event sigma allowing a final-result sigma of 7 ns in the absence of any other sources of statistical error).
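
The sqrt(N) intuition above is easy to check with a toy Monte Carlo (assumed: 16,111 events drawn from a perfectly flat 10.5 μs window, which is only an approximation to the real pulse shape). It reproduces the ~22-24 ns naive standard error on the mean, which makes the quoted 6.9 ns look surprising unless the maximum-likelihood fit to the measured waveform extracts more per-event information than a simple average:

```python
import random
import statistics

# Toy Monte Carlo of the naive averaging argument (assumed flat pulse).
random.seed(1)
N = 16_111                           # number of detected events
width = 10.5e-6                      # flat proton-pulse window, s
events = [random.uniform(0, width) for _ in range(N)]

sigma_event = width / 12**0.5        # sigma of a flat distribution, ~3.0 us
sem = sigma_event / N**0.5           # expected error on the mean, ~24 ns
mean = statistics.mean(events)

print(f"per-event sigma : {sigma_event*1e9:.0f} ns")
print(f"sigma/sqrt(N)   : {sem*1e9:.1f} ns")
print(f"toy mean offset : {abs(mean - width/2)*1e9:.1f} ns")
```

So for a flat pulse, plain averaging cannot do better than roughly 24 ns; this is exactly the gap between 22 ns and 6.9 ns being questioned here.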
 One thing I don't quite understand. They take the variance near the central value of the maximum likelihood function by assuming it is like a Gaussian. OK. But why are they justified in using this value for the statistical uncertainty in delta t? Naively it seems like they are throwing out most of the information.

> Quote by Haelfix: One thing I don't quite understand. They take the variance near the central value of the maximum likelihood function by assuming it is like a Gaussian. OK. But why are they justified in using this value for the statistical uncertainty in delta t? Naively it seems like they are throwing out most of the information.

Well, they're just experimentalists! ;)

They are merely publishing the results they obtained; they are not interpreting them.
 Can I ask a rather simple question? Until now, have neutrinos ever been observed at low speeds or at rest? Or do we always see them travel at the speed of light, give or take small differences?

> Quote by McLaren Rulez: Can I ask a rather simple question? Until now, have neutrinos ever been observed at low speeds or at rest? Or do we always see them travel at the speed of light, give or take small differences?

I sincerely do not know, but if you're trying to infer a point with your question, I fail to see it.

The issue here is not the neutrino velocity per se; it is the apparent fact that they travel at superluminal speed, which should be impossible.
