CERN team claims measurement of neutrino speed >c

In summary, before posting in this thread, readers are asked to read three things: the section on overly speculative posts in the thread "OPERA Confirms Superluminal Neutrinos?" on the Physics Forums website, the paper "Measurement of the neutrino velocity with the OPERA detector in the CNGS beam" published on arXiv, and the previous posts in this thread. The original post discusses the potential implications of a claim by Antonio Ereditato that neutrinos were measured to be moving faster than the speed of light. There is debate about the possible effects on theories such as special and general relativity, and about the problems of synchronizing clocks and measuring the distance over which the neutrinos traveled.
  • #71
TrickyDicky said:
What happened to the last posts by ? and rodhy?
Were they erased?
TrickyDicky,

Rhody here. Yes, mine was. I was in a hurry this morning and, in the interest of accuracy, should have provided the link, which I will again here: the Guardian article, http://www.guardian.co.uk/science/2011/sep/22/faster-than-light-particles-neutrinos . The experts will examine this paper with a fine-tooth comb, and any weaknesses or errors will be found, if there are any. If there are none, the results will need to be independently verified and those results compared against this one. Let's see what transpires at the press conference at CERN today. It should prove interesting, to say the least.

Rhody... :blushing: :cool:
 
  • #72
Vanadium 50 said:
It's not relevant. Essentially what they are doing is measuring the Lorentz-invariant interval between the production and detection of the neutrinos, and comparing that to a null interval. Since interval is a Lorentz invariant quantity, it doesn't matter what frame they worked it in.

Can you explain this more? If they are moving faster than c (the spacetime c), this would make the interval spacelike and, taken as a particle worldline, would represent travel back in time. Also, while it doesn't matter what frame you compute it in, you must make measurements in some frame or frames to compute the interval, no?
 
  • #73
TrickyDicky said:
One thing I find remarkable is that (in case the OPERA results are not flawed) many people are willing to shoot down relativity as we know it, as heavy a blow as that would be for the whole of physics.
However, the results could involve not only relativity but also the weak interaction, yet nobody seems to question any of the assumptions about neutrinos as particles or the weak interaction as a theory, even though the experimental results with neutrinos are not as clean and direct as the relativity ones.
I agree. After looking at the paper it is clear that this cannot be explained simply by photon mass, but that still leaves a lot of possibilities:
1) tachyonic neutrinos (not likely due to stellar observations)
2) SR violation (not likely due to large number of more sensitive experiments)
3) Experimental error (most likely, but would have to be subtle)
4) Change to standard model (somewhat likely, this is the primary purpose of doing particle physics after all)

I am sure there are others as well.
 
  • #74
Vanadium 50 said:
It's not relevant. Essentially what they are doing is measuring the Lorentz-invariant interval between the production and detection of the neutrinos
I may agree only partially. They do not measure intervals. They measure the space-time coordinates of the production point (in the CERN reference frame) and the space-time coordinates of the detection (in the Gran Sasso frame). In order to calculate the interval they must transform those results into a common frame. There are lots of possible errors to be made in this process, especially since neither of the lab frames is inertial.

Vanadium 50 said:
If the experimenters are competent, [...]
... then we would never double check any experiments nor their analysis. Nor question their results.

Vanadium 50 said:
You get the electronics timing by checking the time difference between input and output on a scope.
Exactly - if I have two boxes in two racks in the same lab room, connected to one oscilloscope.
But here we have two sets of electronics, which (maybe) got compared once at CERN, after which one of them was transported to Gran Sasso. Since then they have been running in frames with measurable relative time dilation and different environmental conditions (at the very least, ambient pressure differs by 10 kPa or so: CERN is 400 m above sea level, while Gran Sasso is in the high mountains at about 1700 m). Question: were those effects considered in the analysis? I doubt it. The paper says nothing about that.

Vanadium 50 said:
The detector timing is a little trickier, but signal formation time for plastic scintillator and even a slow phototube is a few nanoseconds.
They use wavelength-shifter fibres to collect light from the scintillating plates. According to the CERN Particle Detector BriefBook (http://rkb.home.cern.ch/rkb/PH14pp/node203.html#202), WLS fibres blur the readout by 10-20 ns, the scintillator itself by a few nanoseconds, and the photomultiplier by a few more nanoseconds.
 
  • #75
They definitely discuss frames used for different measurements, in the paper. I have not gone through to see that all factors I could think of are addressed, but they certainly do consider frames for each measurement:

"Since TOFc is computed with respect to the origin of the OPERA reference frame, located beneath the most upstream spectrometer magnet, the time of the earliest hit for each event is corrected for its distance along the beam line from this point, assuming a time propagation according to the speed of light. The UTC time of each event is also individually corrected for the instantaneous value of the time link correlating the CERN and OPERA timing systems."
 
  • #76
xts said:
They use wavelength-shifter fibres to collect light from the scintillating plates. According to the CERN Particle Detector BriefBook (http://rkb.home.cern.ch/rkb/PH14pp/node203.html#202), WLS fibres blur the readout by 10-20 ns, the scintillator itself by a few nanoseconds, and the photomultiplier by a few more nanoseconds.

nice observation!

Now, I wonder how they came up with an error of 10ns - any mention in the paper showing how the systematic error was actually calculated?
 
  • #77
PAllen said:
They definitely discuss frames used for different measurements[...] is computed with respect to the origin of the OPERA reference frame,
I wouldn't call it a "discussion", rather a half-sentence mention... I would really like to see what corrections they claim to use and whether they cover in any way the non-inertiality of the Gran Sasso ref. frame.
 
  • #78
One of the diagrams in the paper refers to both locations using GPS to derive local time, and the diagram shows a single satellite. In fact a GPS receiver needs at least four satellites for a combined position-and-time fix, and each satellite carries its own atomic clock. The satellites all transmit on the same frequency, and when the signals arrive at the receiver, timing differences as well as relativistic effects are combined to give local time.

I'd like to see how accurate they BELIEVE their local clocks are. Sub-nanosecond?
 
  • #79
kfmfe04 said:
I wonder how they came up with an error of 10ns
That is achievable: their results are based on statistics of 16,000 events, so the exponential blur may be reduced. I do not question their statistical analysis (I haven't checked it in detail, but at first sight it seems to be correct).

kfmfe04 said:
any mention in the paper showing how the systematic error was actually calculated?
No. And that is my major point against the paper: they do not discuss those issues (except for GPS time synchronisation, to which they paid lots of attention, and the geodetic measurements, which they mentioned). And the numbers they present for the various components of the systematic error seem, to my intuition, to be significantly underestimated. Some of the possible sources of systematic error are not even mentioned in the paper.
 
  • #80
ColonialBoy said:
I'd like to see how accurate they BELIEVE their local clocks are. Sub-nanosecond?
They estimate it as 1.7ns.
What seems much harder to believe is the 20 cm accuracy of the baseline measurement (both labs are deep underground, so you must carry the geodetic survey down from surface reference points through a maze of tunnels).
I am also curious whether they correctly corrected this baseline for SR dilation: remember that the CERN and Gran Sasso frames differ in velocity.
 
  • #81
DaleSpam said:
I agree. After looking at the paper it is clear that this cannot be explained simply by photon mass, but that still leaves a lot of possibilities:
1) tachyonic neutrinos (not likely due to stellar observations)
2) SR violation (not likely due to large number of more sensitive experiments)
3) Experimental error (most likely, but would have to be subtle)
4) Change to standard model (somewhat likely, this is the primary purpose of doing particle physics after all)

#4 is impossible. There is no way that a change to the SM of particle physics will let you have a neutrino interaction 60 ns before it arrives. (Or, alternatively, have it begin to travel 60 ns before it's produced. Or some combination)


xts said:
I may agree only partially. They do not measure intervals. They measure space-time co-ordinates of production point (using CERN ref. frame) and space-time co-ordinates of detection (using Gran-Sasso frame). In order to calculate interval they must transform those results to common frame. There are lots of possible errors to be made in this process, especially that neither of lab frames is inertial.

The way you would like to do this is have the light and the neutrinos start together and go through the same path. Of course that's impossible. So instead what you do is you set up a triangle, with light (well, radio) emerging from one point and being detected at the source and destination points. If you work this out, you will discover that the interval between the source and destination is independent of their relative motion.

SR/GR effects only matter in this problem if you have the radio pulse, wait (using local clocks to measure how much time elapses) and then do the experiment. The drift between a clock at CERN and one at LNGS is probably around 20-30 ns per day. But anyone who has used a GPS navigator knows that it syncs much more often than this - a few seconds at most. So these effects are completely zeroed out by the way the measurement is constructed.

xts said:
They use wavelength-shifter fibres to collect light from the scintillating plates. According to the CERN Particle Detector BriefBook (http://rkb.home.cern.ch/rkb/PH14pp/node203.html#202), WLS fibres blur the readout by 10-20 ns, the scintillator itself by a few nanoseconds, and the photomultiplier by a few more nanoseconds.

I do this for a living. Remember, for timing what matters is the rise time, not the total signal formation time. You get the worst timing in a calorimetric configuration, because there you want to collect all the light. This way, you have a total signal formation time around 60 ns (say 45-120 ns, depending on the detector) and can usually time in the leading edge to better than 5 ns. That is already good enough, but in a tracking configuration, using constant fraction discriminators, 1 ns is doable. OPERA claims 2.3 ns.

If I were charged with straightening this out, I'd be looking at the software for the Septentrio PolaRx2e. This is the specialized GPS receiver they had to use, and the desire to measure nanosecond-level timing over distances of hundreds of kilometers is probably not a common application. Uncommon applications means less well-tested software. I would also re-do the tunnel survey: GPS tells you where the antenna is. Finally, I'd redo the CERN survey. (GPS tells you where the antenna is) Both of those surveys should be done by independent teams who do not have access to the original surveys.
 
  • #82
Vanadium 50 said:
SR/GR effects only matter in this problem if you have the radio pulse, wait (using local clocks to measure how much time elapses) and then do the experiment. The drift between a clock at CERN and one at LNGS is probably around 20-30 ns per day. But anyone who has used a GPS navigator knows that it syncs much more often than this - a few seconds at most. So these effects are completely zeroed out by the way the measurement is constructed.

I see this for the time measurement, but how about the distance measurement? It looks to me (and you appear to agree from what you say later in your post) like they are using GPS coordinates for the source and destination events and then calculating the distance between them. But the actual, physical distance the neutrinos have to travel will not be quite the same as the "Euclidean" distance calculated from the difference in GPS coordinates, because of GR effects. I posted earlier in this thread that my initial guess is that the physical distance would be slightly shorter, but I haven't calculated the effect. I also can't tell from the paper whether this was taken into account in the distance computation.
 
  • #83
Vanadium 50 said:
Remember, for timing what matters is the rise time, not the total signal formation time.
That also smells a bit fishy in the OPERA experiment. At CERN they measure a strong pulse of millions of muons, while at LNGS they detect single muons. So the shape of the signal, its discrimination, etc. plays a pretty significant role. Again, the paper does not explain clearly how they correct for such issues.
 
  • #84
Vanadium 50 said:
If I were charged with straightening this out, I'd be looking at the software for the Septentrio PolaRx2e.

That's what most of our experimentalists think as well. A subtle software error or bug.
 
  • #85
PeterDonis said:
I see this for the time measurement, but how about the distance measurement? It looks to me (and you appear to agree from what you say later in your post) like they are using GPS coordinates for the source and destination events and then calculating the distance between them. But the actual, physical distance the neutrinos have to travel will not be quite the same as the "Euclidean" distance calculated from the difference in GPS coordinates, because of GR effects. I posted earlier in this thread that my initial guess is that the physical distance would be slightly shorter, but I haven't calculated the effect. I also can't tell from the paper whether this was taken into account in the distance computation.

An interesting thing to consider is what is the minimum number of dimensions they need to analyze the distance/time measurements. If they could really do it as (x,t) only, I believe you can generally achieve exact Minkowski flat embedding of a finite (x,t) surface, if you have complete freedom to define your coordinates. However, if using a triangle, it seems to me they need at least (x,y,t), in which case exact flatness is impossible over a finite region. I also have not calculated the scale of difference for the near Earth region.
 
  • #86
The implications of this experiment are not as relevant as the fact that
the interpretation is incorrect. These neutrinos simply didn't break the
speed of light barrier and as a result any further extrapolation is
unnecessary. The reasoning behind this is as follows:

1. Einstein showed that it cannot be done.

2. A mass-containing object that reaches the speed of light stops moving.
If these neutrinos were able to exceed the speed of light then they
would not have reached the target facility and therefore could not be
observed in order to have their speed measured.

3. Transmogrification of sub-atomic particles is impossible. If the
neutrinos that are being sent from CERN are not the same sub-atomic
particles being observed at the target facility, then they are
measuring the speed of different objects.

4. As the observers affect the observation, since there are two different
facilities in the experiment, each with different observers, the
observer's speed of light at the CERN facility is different to the
observer's speed of light at the target facility and therefore the
difference in these speeds of light will affect the experiment.
 
  • #87
PeterDonis said:
I posted earlier in this thread that my initial guess is that the physical distance would be slightly shorter, but I haven't calculated the effect.

GR effects are typically 10^-10, and we need something bigger than 10^-5.

The first-order GR effect is just plain "falling", and that's 60 microns. We need 60 feet.
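As a sanity check on that order of magnitude, here is a back-of-envelope free-fall estimate of my own (a flat-Earth approximation over the flight time, not a calculation from the paper):

```python
# Rough size of the "falling" effect over the ~730 km CERN-LNGS flight.
c = 299_792_458.0     # speed of light, m/s
g = 9.81              # m/s^2
baseline = 730_000.0  # m, approximate CERN-to-Gran-Sasso distance

t = baseline / c       # flight time, about 2.4 ms
drop = 0.5 * g * t**2  # free-fall displacement during the flight

print(f"flight time {t*1e3:.2f} ms, drop {drop*1e6:.0f} microns")
# Tens of microns, versus the ~18 m (60 ns times c) that needs explaining.
```

Either way, the effect comes out many orders of magnitude too small to matter here.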

xts said:
That is also something smelling a bit fishy in OPERA experiment. At CERN they measure strong pulse of millions of muons, while at LNGS they detect single muons. So the shape of the signal, its discrimination, etc. plays pretty significant role. The paper again does not explain clearly how do they correct for such issues.

Again, this is trivial, and probably not worth reporting in detail. Anyone competent can do this. If you have a CFD, it's easier, otherwise, you measure the slewing and correct for it. It's also hard to get a 60ns offset from a device that has a rise time of a few ns.
 
  • #88
Vanadium 50 said:
The first-order GR effect is just plain "falling", and that's 60 microns. We need 60 feet.

Oh, well, another beautiful theory spoiled by an ugly fact. :wink: Thanks, this gives a good sense of the relative order of magnitude of GR effects for this experiment.
 
  • #89
Did anyone else watch the press conference? I watched it; although I didn't fully understand everything they were talking about, it sounded fairly good: they were very cautious in their claims and very open to questions.
 
  • #90
Vanadium 50 said:
GR effects are typically 10^-10, and we need something bigger than 10^-5.
Also, things like the Sagnac effect are several orders of magnitude too small.
 
  • #91
Just one more (silly) doubt.

They rely on a collection of independent measurements, each with a statistical error of 2.8 microseconds (the events come from a nearly flat distribution 10.5 microseconds wide).

How did they get a final statistical error of 6.9 nanoseconds with only 16,111 events in total?
 
  • #92
ColonialBoy said:
One of the diagrams in the paper refers to both locations using GPS to derive local time, and the diagram shows a single satellite. In fact GPS uses at least 3 and possibly 5 satellites, and each satellite has its own atomic clock. The satellites all transmit on the same frequency, and when the signals arrive back on earth, timing differences as well as relativistic effects are combined to give local time.

I'd like to see how accurate they BELIEVE their local clocks are. Sub-nanosecond?

Their local clocks are certainly much better than sub-nanosecond. Note that one nanosecond is a huge time interval in modern time metrology (there are single-chip clocks with a 100 s Allan deviation better than 10^-11). Moreover, time transfer with an accuracy better than a few ns is more or less trivial. Hence, it is very unlikely that there are any systematic errors due to time-keeping or time transfer. Even a normal off-the-shelf commercial GPS clock will stay within about ±30 ns of UTC.
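For scale, a quick sketch of what that stability figure implies (my own rough arithmetic, using the standard rule of thumb that time wander over an averaging interval tau is about ADEV(tau) times tau):

```python
# Drift implied by an Allan deviation of 1e-11 at tau = 100 s.
allan_dev = 1e-11
tau = 100.0               # averaging time, seconds

drift_s = allan_dev * tau  # typical clock wander over one averaging interval
print(f"~{drift_s*1e9:.0f} ns per 100 s")
```

So even a modest clock of that class wanders by only about a nanosecond per 100 s, far below the 60 ns effect under discussion.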

Also, note that both METAS and PTB have been involved in the time keeping/transfer; there is virtually no chance that people from both of those organizations would overlook a mistake, since this is quite literally what they do every day (both PTB and METAS operate atomic clocks that are part of UTC).
 
  • #94
xts said:
Just one more (silly) doubt.

Please look at Figure 11 in the paper.
 
  • #95
gambit7 said:
Is it a viable check to undertake the suggestion of replicating the energies of the 1987 supernova?

No. The energies are too small by a factor of 1000. The neutrinos cannot be efficiently produced nor efficiently detected at these energies.
 
  • #96
Here's an interesting interview with Ereditato & Autiero posted on Youtube:



(not many details about the experiment itself, but you can see how open minded they are about the results...)

This one gives a broader description of CERN's neutrino experiments/OPERA (for those of us without advanced degrees in physics):

 
  • #97
OK can anyone explain this to me (and maybe a few others) - have I got it right?

The explanation of the astronomical evidence from the supernova explosion is not that obvious.
I mean, the neutrino pulse did arrive on Earth before the light pulse, as it would if the neutrinos were faster than light. And to be only 3 h apart after 160,000 years means they are impressively close in speed. And we can accept the light's excuse for lateness, that it got held up in the terribly dense traffic inside the supernova (I will try that one myself sometime). So that explains it away; I will accept that 3 h is a reasonable estimate for such a delay. But that is only saying there is no contradiction. We can't calculate or observe the delay to the nearest billionth of a second, I'm sure.

So what Strassler seems to rely on is not the coincidence of the two pulses but the fact that the neutrinos arrived closely bunched, is that right? Now I know from scintillation counting that beta decays give off [itex]\beta[/itex]'s with a spectrum of energies, and I suppose the neutrinos have a spectrum of kinetic energies too. If they have a spectrum of energies they must have a spectrum of velocities. But the observed spectrum of velocities is very narrow. So if what happens in supernovae is like my remembered scintillation counting and there is a broad spectrum of energies, the only way to have a broad energy spectrum and a narrow velocity spectrum is, by SR, for them to be traveling close to the speed of light.

Was something like that the implicit argument?

So close to speed of light, their rest mass must be very small.

But close to doesn't quite tell me slower than or faster than. :confused:
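For the record, the SR relation the argument leans on (my spelling-out, not from the thread): for a particle of real mass m and energy [itex]E \gg mc^2[/itex],

[tex]\frac{v}{c}=\sqrt{1-\left(\frac{mc^2}{E}\right)^2}\;\approx\;1-\frac{1}{2}\left(\frac{mc^2}{E}\right)^2[/tex]

so a broad spread in E compresses into a tiny spread in v. Note the right-hand side is always slightly below 1 for real m, which is why the bunching argument alone pins the speed close to c but needs an assumption about the sign of m^2 to say which side of c the particles are on.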
 
  • #98
Vanadium 50 said:
(silly doubt: 6.9ns error from 16,000 sample of 2800ns errored measurements) Please look at Figure 11 in the paper.
Fig. 11? It illustrates that the data, shifted by the 1048 ns correction, visually fit the prediction, while originally they were offset; it says nothing about the statistical errors.

I just doubt how you can average a 16,000-event sample drawn from a sigma = 2800 ns distribution and get a final result with sigma = 6.9 ns. I would rather expect the final sigma to be at least sigma/sqrt(N) = 22 ns.

The pulses are 10.5 microseconds wide. The protons (and hence the created neutrinos) are not distributed uniformly over that span, but as the example pulse (fig. 4) shows, their sigma is definitely bigger than 875 ns (which would be the maximum single-event sigma allowing a final result of sigma = 7 ns in the absence of any other sources of statistical error).
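The naive scaling being invoked here, spelled out (a sketch only; the paper's quoted 6.9 ns comes from a maximum-likelihood fit to the full waveform shape, not from a plain average):

```python
import math

sigma_single = 2800.0  # ns, per-event statistical spread quoted above
n_events = 16_111      # events in the OPERA sample

# Standard error of the mean if the events were simply averaged:
sigma_mean = sigma_single / math.sqrt(n_events)
print(f"naive sigma of the mean: {sigma_mean:.1f} ns")  # about 22 ns
```

The gap between ~22 ns and the claimed 6.9 ns is exactly what the likelihood fit has to earn by exploiting the pulse shape.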
 
  • #99
One thing I don't quite understand..

They take the variance near the central value of the maximum likelihood function by assuming it is like a Gaussian. OK. But why are they justified in using this value for the statistical uncertainty in delta t? Naively it seems like they are throwing out most of the information.
 
  • #100
Haelfix said:
One thing I don't quite understand..

They take the variance near the central value of the maximum likelihood function by assuming it is like a gaussian. OK. But why are they justified in using this value for the statistical uncertainty in delta t? Naively it seems like they are throwing out most of the information.

Well, they're just experimentalists! ;)

They are merely publishing the results they've been getting; they are not interpreting them.
 
  • #101
Can I ask a rather simple question? Until now, have neutrinos ever been observed at low speeds or rest? Or do we always see them travel at the speed of light, give or take small differences?
 
  • #102
McLaren Rulez said:
Can I ask a rather simple question? Until now, have neutrinos ever been observed at low speeds or rest? Or do we always see them travel at the speed of light, give or take small differences?

I sincerely do not know, but if you're trying to make a point with your question, I fail to see it.

The issue here is not the neutrino velocity as such; it is the apparent fact that they travel at superluminal speed, which should be impossible.
 
  • #103
V50's posts have convinced me there has to be an error somewhere. The numbers just do not match up with previous experiments and the supernova data. Massive photons could be possible, but this experiment would exceed the upper bound on their mass supported by so many other, more accurate trials. The interesting questions are what the error is, how it could be so subtle as to trick so many scientists and engineers, and whether or not it affects other experiments or equipment.

As for further experiments, would using the same setup/equipment over a longer distance quickly reveal a systematic error? It seems to me that an error in the experiment's timing would not scale with the distance the neutrinos travel. So if we move the detector twice as far away and the neutrinos still arrive 60 ns early instead of 120 ns, would we have very strong evidence for an error? The Earth's diameter is over 17x the distance these neutrinos traveled; building another emitter or detector on the far side of the planet would yield better timing allowances. Am I right?
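The scaling logic above can be made concrete with a toy model (hypothetical numbers of my own, assuming the hidden error is a constant timing offset rather than anything distance-dependent):

```python
C = 299_792_458.0  # speed of light, m/s

def early_ns_fixed_offset(baseline_m, offset_ns=60.0):
    """Apparent early arrival if neutrinos travel at exactly c but a
    constant offset contaminates the timing chain."""
    return offset_ns  # independent of baseline by construction

def early_ns_real_excess(baseline_m, dv_over_c=2.5e-5):
    """Apparent early arrival if neutrinos are genuinely faster by dv/c."""
    return baseline_m / C * dv_over_c * 1e9  # ns

for d in (730e3, 2 * 730e3):
    print(f"{d/1e3:.0f} km: offset model {early_ns_fixed_offset(d):.0f} ns, "
          f"real-excess model {early_ns_real_excess(d):.0f} ns")
# A fixed offset stays at 60 ns at any baseline; a genuine speed excess
# scales linearly, roughly doubling when the baseline doubles.
```

So yes: repeating the measurement over a different baseline would separate the two hypotheses cleanly, provided the error really is a fixed offset.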
 
  • #104
How are they ensuring that the neutrinos in Gran Sasso are the same neutrinos from CERN? There is no way to tag these objects. If there are billions of neutrinos passing through our eyes every second, is it possible that these could be neutrinos from another source?

This answer just seems too obvious, but how are they confirming that the neutrinos from CERN are the same as the ones at Gran Sasso?
 
  • #105
eiyaz said:
How are they ensuring that the neutrinos in Gran Sasso are the same neutrinos from CERN? There is no way to tag these objects. If there are billions of neutrinos passing through our eyes every second, is it possible that these could be neutrinos from another source?

I imagine there is some background level of neutrinos in the detector, and spikes in activity correspond to CERN's emission timing. Over thousands of bursts, you can be certain the excess is coming from CERN.

As an example, billions of photons pass through your cell phone every second; this doesn't stop it from being able to discriminate the signal from the cell tower.
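A minimal illustration of that kind of coincidence discrimination (toy numbers of my own, nothing like OPERA's actual rates or timing structure):

```python
import random

random.seed(42)
n_bins = 10_000                      # time bins over a simulated run
p_bg = 0.01                          # chance of one background count per trial
beam_bins = {100, 2_100, 4_100, 6_100, 8_100}  # bins coincident with beam spills
signal_per_spill = 5                 # extra counts in each beam-coincident bin

# Background: a few Bernoulli trials per bin, yielding mostly 0 or 1 counts.
counts = [sum(random.random() < p_bg for _ in range(10)) for _ in range(n_bins)]
for b in beam_bins:
    counts[b] += signal_per_spill    # beam-coincident excess

# Bins far above the background expectation flag the beam:
flagged = {i for i, n in enumerate(counts) if n >= 4}
print(beam_bins <= flagged)  # -> True: every spill bin stands out
```

The beam-coincident bins always exceed the threshold, while a background bin does so only with vanishing probability, so repetition over many spills makes the attribution essentially certain.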
 

What is CERN and why is it important?

CERN (European Organization for Nuclear Research) is a European research organization that operates the largest particle physics laboratory in the world. It is important because it conducts groundbreaking experiments and research in the field of particle physics, leading to new discoveries and advancements in our understanding of the universe.

What is the measurement of neutrino speed >c and why is it significant?

The measurement of neutrino speed >c refers to the finding by the CERN team that neutrinos, a type of subatomic particle, were observed to travel faster than the speed of light. This goes against the widely accepted theory of relativity and could potentially revolutionize our understanding of physics and the laws of the universe.

How did the CERN team conduct this measurement?

The CERN team used the Super Proton Synchrotron (SPS) accelerator to create the CNGS beam of neutrinos and then measured the time it took the neutrinos to travel the roughly 730 kilometres to the OPERA detector in Italy. They repeated this measurement over many events and found that the neutrinos consistently arrived earlier than expected, indicating a speed faster than light.

What are the potential implications of this measurement?

If the measurement of neutrino speed >c is confirmed, it could potentially challenge our current understanding of the laws of physics and force us to rethink our theories. It could also open up new possibilities for faster-than-light travel and communication.

Has this measurement been confirmed by other scientists?

No, this measurement has not been independently confirmed by other scientists yet. The CERN team has invited other researchers to replicate the experiment and verify their findings, and the scientific community is eagerly awaiting further evidence and validation of this groundbreaking discovery.
