CERN team claims measurement of neutrino speed >c

  • Thread starter turbo
  • Start date
  • #201
f95toli
Science Advisor
Gold Member
2,948
442
As I have said before, the application of using GPS to synchronize two distant stations to a nanosecond is not common, and as such I am less confident that the firmware in the unit is bug-free than had the application been more widely used.
I'd say it is very common. GPS is one of the two systems used for time transfer within UTC itself, meaning this is done routinely. Granted, it is the less accurate system, but the reason they used it here is presumably because it is good enough. Note that their clocks were calibrated by PTB and METAS and checked by movable time transfer. Hence, I think we can be pretty sure that someone would have told them if they were doing something wrong.
 
  • #202
28,902
5,160
Mmm, but how come, then, that they find agreement with a land-based survey, which measures in the rotating frame, as they say in the paper?
The difference in distance between the land-based and GPS surveys is several orders of magnitude too small to matter.
 
  • #203
PeterDonis
Mentor
Insights Author
2019 Award
27,841
7,713
I'm just not sure that there is a general acceptance that a path which passes through a region of (albeit slightly) lower gravitational potential might be shorter than the calculated distance between two points on the surface of geoid.
Whatever effect there might or might not be from this, it is way too small to matter. See post #177; at most a change in path length due to the change in gravitational potential would be about a 10^-10 effect.
 
  • #204
PAllen
Science Advisor
2019 Award
7,910
1,199
Right. I stand corrected. I didn't realize that the GR effect was important here, as Vanadium stated that gravitational effects account for something like 10^-10 and I took that for granted.

However, SR effects account for about 10^-6 (relative velocities), so if what you say is correct, this means that GR effects are also of the order of 10^-6 for a depth of 20-something km. Now, the chord of a 700 km arc dips about 10 km deep into the earth, so one would then expect a similar GR correction to the interval.
How do you get the SR effect you claim? For gamma to differ from 1 by 1 part in 10^7, I get a required relative speed of 83 miles per second. For a relative speed of 1000 mph, gamma differs from 1 by 1 part in 10^12 or so.
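As a quick sanity check on those numbers (my own back-of-the-envelope arithmetic using nothing but the textbook gamma factor, not anything from the paper), a minimal sketch:

[code]
# Back-of-the-envelope check of the gamma values quoted above (standard SR only,
# nothing OPERA-specific): for small beta, gamma - 1 is roughly beta^2 / 2.
import math

C = 299_792_458.0          # speed of light, m/s
MILE = 1609.344            # metres per mile

def gamma_minus_one(v):
    """Exact gamma - 1 for a speed v given in m/s."""
    beta = v / C
    return 1.0 / math.sqrt(1.0 - beta**2) - 1.0

for label, v in [("83 miles/s", 83 * MILE),            # ~134 km/s
                 ("1000 mph", 1000 * MILE / 3600.0)]:   # ~447 m/s
    print(f"{label:>10}: gamma - 1 = {gamma_minus_one(v):.1e}")

# Expected output (order of magnitude): ~1e-7 for 83 miles/s, ~1e-12 for 1000 mph.
[/code]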
 
  • #205
vanesch
Staff Emeritus
Science Advisor
Gold Member
5,028
16
Second, the difference between a rotating earth frame and a stationary frame is essentially irrelevant. If you draw the space-time diagram for the setup, including the GPS satellites (one is enough if you assume it's already synchronized), you will discover what they are measuring is very close to the interval between emission and detection, which is a Lorentz invariant.
Well, I don't need to know how the GPS system actually works; what counts is the result it delivers. If it gives you the synchronised reference time in a stationary frame, then you assume that they have built in all the necessary corrections to do so.

What I wanted to say was this: suppose you "synchronize" in a stationary reference frame Oxyzt, meaning that at the events "Emission" and "Reception" you measure "t" (the t of the reference frame Oxyzt), but you measure the distance between "Emission" and "Reception" in a frame Ox'y'z't', using the worldlines of points that are stationary in Ox'y'z't' (because the distance is easy to measure in that frame). Then you cannot combine this distance measured in Ox'y'z't' with a time measured in Oxyzt.

My question was what kind of time coordinate (in what kind of frame) is used in the GPS system (no matter how they actually do it, assuming they do it right), and I thought that this was only possible in an inertial frame. However, I stand corrected: this can also be a time on the rotating geoid, which also carries another "universal time", as I had forgotten about the GR correction.

But it DOES matter what reference frame one uses to define "synchronised time", because mixing a time coordinate from one frame and a distance from another is at the origin of all "paradoxes" in introductory SR, such as the pole-barn paradox and the like.

There are two corrections that need to be applied - one is the fact that LNGS is moving 50 mph faster than CERN because of the earth's rotation: that's a 10^-15 effect.
Which, according to D H, should then be cancelled by the geoid effect.

The other is that the earth has moved between the emission and detection times by a few feet. That should be properly taken into account by the GPS receiver (and I have questioned this), and if it is, it's a 10^-6 effect on the 10^-5 effect, or 10^-11.
That's if you're working in an inertial frame! If you work in the rotating frame, that is not the case. This is why defining the correct reference frame is so important, and rather tricky in this case.

The point is not that I think I'm smarter than those guys; it is just that none of this was mentioned in the paper.
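To put rough numbers on the "few feet" and the 10^-6 figure quoted above, here is a back-of-the-envelope sketch; the baseline length and latitude are my own approximations, not values taken from the paper:

[code]
# Rough scale of the "earth moves during the flight" effect discussed above
# (my own estimate, not from the OPERA paper). Assumptions: ~730 km baseline,
# ~45 degrees latitude.
import math

C = 299_792_458.0            # m/s
BASELINE = 730e3             # CERN-LNGS distance, m (approximate)
OMEGA_E = 7.292e-5           # Earth's rotation rate, rad/s
R_EARTH = 6.371e6            # mean Earth radius, m
LAT = math.radians(45.0)     # rough latitude of the baseline

tof = BASELINE / C                               # light-speed time of flight
v_surface = OMEGA_E * R_EARTH * math.cos(LAT)    # ground speed due to rotation
moved = v_surface * tof                          # how far the ground moves in that time

print(f"time of flight       ~ {tof * 1e3:.2f} ms")
print(f"surface speed        ~ {v_surface:.0f} m/s")
print(f"ground moves         ~ {moved:.2f} m   (a few feet)")
print(f"fraction of baseline ~ {moved / BASELINE:.1e}")   # ~1e-6, as quoted above
[/code]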
 
  • #206
vanesch
Staff Emeritus
Science Advisor
Gold Member
5,028
16
How do you get the SR effect you claim? For gamma to differ from 1 by 1 part in 10^7, I get a required relative speed of 83 miles per second. For a relative speed of 1000 mph, gamma differs from 1 by 1 part in 10^12 or so.
I was talking about beta, and was thinking of a series expansion in beta, but now that you mention it, for most relativistic corrections the first non-zero term is in beta squared. So that indeed pushes the effects into the 10^-12 range or so.

I guess this closes the discussion about a relativistic effect due to earth's gravity or rotation...
 
  • #207
11
0
Hello,

How was it possible to measure the time of flight with 10 ns precision, based on this 10 µs proton pulse?

Thanks for your help.

Michel

(before eventually re-starting a specific thread focusing on data analysis)
One pulse does not have a good enough signal-to-noise ratio to get a time-of-flight precision of a few ns. The proton pulse actually doesn't carry much current by everyday lab standards, although it is huge in terms of TeV-scale particles. So they built a model by averaging many emitted pulses, and compared many received pulses to it. Several people have pointed out that there can be hidden assumptions when that is done. For example, one hidden assumption might be that the emitted pulse has an invariant shape, apart from band-limited Gaussian noise. If that's wrong, then the mathematical processing used to put together the 'average' of many pulses goes slightly wrong and might introduce a bias which could be unaccounted for. (A rough sketch of this template-and-fit idea is given at the end of this post.)

All my tentative "might" and "could" words are there because they're smart guys and maybe they already did it just right, but a paper with that level of total detail in it would be unreadable! There are deep exam-style questions here about experimental technique, just as there should be. It's a lot of work for them to answer even a few of the most carefully considered issues raised by the critics. This will take time. There is no way around it, and they understand that.
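To make the 'build an average pulse, then fit a time shift' idea above concrete, here is a minimal sketch of that kind of analysis on synthetic data. It only illustrates the generic technique (average many noisy pulses into a template, then find the shift that best matches a received waveform); it is not the OPERA likelihood analysis, and all pulse shapes and numbers are made up.

[code]
# Minimal illustration of the generic "template + time-shift fit" idea discussed
# above. Synthetic data only -- this is NOT the OPERA analysis, just the concept.
import numpy as np

rng = np.random.default_rng(0)
dt = 1.0                                   # time bin, ns (made up)
t = np.arange(0.0, 10500.0, dt)            # ~10.5 us window

def pulse(t0):
    """Toy proton-pulse shape: ~10 us long with 1 us rise/fall, starting at t0."""
    rise = np.clip((t - t0) / 1000.0, 0, 1)
    fall = np.clip((t0 + 10000.0 - t) / 1000.0, 0, 1)
    return np.minimum(rise, fall)

# 1) Build a template by averaging many noisy copies of the emitted pulse.
template = np.mean([pulse(0.0) + 0.05 * rng.standard_normal(t.size)
                    for _ in range(500)], axis=0)

# 2) "Received" waveform: the same shape shifted by an unknown delay, plus noise.
true_delay = 60.0                          # ns (made up)
received = pulse(true_delay) + 0.05 * rng.standard_normal(t.size)

# 3) Fit the delay by scanning shifts and minimising the squared mismatch
#    (equivalent to a maximum-likelihood fit for Gaussian noise).
shifts = np.arange(0.0, 200.0, dt)
chi2 = [np.sum((received - np.interp(t - s, t, template, left=0.0, right=0.0))**2)
        for s in shifts]
best = shifts[int(np.argmin(chi2))]
print(f"fitted delay: {best:.0f} ns   (true: {true_delay:.0f} ns)")
[/code]

If the emitted pulse shape drifts between the averaging and the measurement, the minimum of the mismatch curve moves, which is exactly the kind of bias being worried about above.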
 
  • #208
Vanadium 50
Staff Emeritus
Science Advisor
Education Advisor
2019 Award
23,852
6,297
I'd say it is very common. GPS is one of the two systems used for time transfer within UTC itself, meaning this is done routinely. Granted, it is the less accurate system, but the reason they used it here is presumably because it is good enough.
That's the point - who is using something more complicated than something you buy at Fry's for this particular application? The bigger the market for this, the less likely something is odd in the firmware.
 
  • #209
13
0
They said they used a 3-D coordinate system, which implies they considered this.
Sorry, didn't know that. But another problem arises with the use of GPS. The satellites which are making these measurements may slip a bit in their orbits - they are not in absolutely perfect geostationary orbits. Even a deviation of [itex]\pm[/itex]1 meter could have an enormous effect on the accuracy of the neutrino reading.
 
  • #210
D H
Staff Emeritus
Science Advisor
Insights Author
15,393
682
GPS satellites are not in geosynchronous orbits.

Whatever mistake was made, if a mistake was made, was quite subtle. That group has been building up this data for a few years. They looked for obvious explanations, not so obvious explanations, asked outside groups for help, and still couldn't find anything that explained their results.

I'm guessing that they did do something wrong. I'm also guessing that we at PhysicsForums will not be the ones to ferret that mistake out.
 
  • #211
13
0
GPS satellites are not in geosynchronous orbits.
It does not matter, there may be +/- a few meters of orbital deviation.

Whatever mistake was made, if a mistake was made, was quite subtle. That group has been building up this data for a few years. They looked for obvious explanations, not so obvious explanations, asked outside groups for help, and still couldn't find anything that explained their results.

I'm guessing that they did do something wrong. I'm also guessing that we at PhysicsForums will not be the ones to ferret that mistake out.
True.
 
Last edited:
  • #212
294
29
They said they used a 3-D coordinate system, which implies they considered this.
As I mentioned earlier in this thread, they said in the presentation that they corrected for GR due to the height difference, and that the correction was on the order of 10^-13.
 
  • #213
D H
Staff Emeritus
Science Advisor
Insights Author
15,393
682
It does not matter, there may be +/- a few meters of orbital deviation.
No. Try centimeters.

Furthermore, the errors in the orbit estimations are irrelevant here. Those experimenters used common view mode, which reduces errors in both relative time and relative position by orders of magnitude. Common view mode, relative GPS, and differential GPS have been around for quite some time. The basic concept is thirty years old, but not the 10 nanosecond accuracy claimed by the experimenters.
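For readers unfamiliar with common-view time transfer, here is a toy sketch of the basic idea (the concept only, not the actual CGGTTS processing): both stations observe the same satellite at the same epoch, and differencing the two readings cancels the satellite clock error and much of the shared path error.

[code]
# Toy illustration of common-view GPS time transfer (concept only, not the real
# CGGTTS processing). Each station measures its own clock against the SAME
# satellite at the same epoch; differencing cancels the satellite clock error.
import random

random.seed(1)
sat_clock_error = 250e-9        # satellite clock error vs GPS time, s (made up)
clock_A = 40e-9                 # true offset of station A's clock, s (made up)
clock_B = -35e-9                # true offset of station B's clock, s (made up)

def measured_offset(station_clock, local_noise_ns=3.0):
    # Each station's raw "clock minus satellite" reading shares the satellite
    # error but has its own small local noise (receiver, multipath, ...).
    return station_clock - sat_clock_error + random.gauss(0, local_noise_ns) * 1e-9

readings = [(measured_offset(clock_A), measured_offset(clock_B)) for _ in range(100)]
diffs = [a - b for a, b in readings]   # the satellite error cancels in each difference
estimate = sum(diffs) / len(diffs)

print(f"true A-B clock offset : {(clock_A - clock_B) * 1e9:.1f} ns")
print(f"common-view estimate  : {estimate * 1e9:.1f} ns")
[/code]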
 
  • #214
294
29
The basic concept is thirty years old, but not the 10 nanosecond accuracy claimed by the experimenters.
In the presentation they said that this precision was common place, just not in the field of particle physics. Did I misunderstand?
 
  • #215
13
0
No. Try centimeters.

Furthermore, the errors in the orbit estimations are irrelevant here. Those experimenters used common view mode, which reduces errors in both relative time and relative position by orders of magnitude. Common view mode, relative GPS, and differential GPS have been around for quite some time. The basic concept is thirty years old, but not the 10 nanosecond accuracy claimed by the experimenters.
The software could have been buggy; that can happen with something that is not as commonplace as this. There are a thousand other factors which could affect the results; no single factor need be responsible for this.
 
  • #216
11
0
Thanks dan_b.
I could not locate a paper describing the "likelihood function" which seems to be the basis for their analysis. Would you have a pointer to such a paper, or do you have some personal idea about it? ....

Michel
Hi Michel,

Likelihood function = probability density function, just a different name, maybe with a different normalization. I apologize in advance because I don't think you're going to like this link very much. I don't. It takes an approach which obscures the intuition if you're not comfortable with the math. It also has links which may be useful. Keep following links, use Google search on the technical terms, and eventually you'll find something you're happy with. Try starting here:

http://en.wikipedia.org/wiki/Probability_density_function
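To connect the link to the actual question: below is a minimal sketch of the kind of likelihood one might write down for a delay measurement, in which the normalised proton waveform plays the role of a probability density for the neutrino detection times. It is a generic illustration with made-up numbers, not the Lk functions of the OPERA paper (whose exact form I have not seen).

[code]
# Generic sketch of a delay likelihood: a normalised emission waveform w(t) is
# treated as the PDF of detection times, shifted by an unknown delay. This is an
# illustration of the idea only, not the OPERA collaboration's Lk functions.
import numpy as np

rng = np.random.default_rng(2)

# Toy normalised template: 10 us trapezoidal pulse with 1 us edges, 1 ns bins.
t_grid = np.arange(0.0, 10500.0, 1.0)                     # ns
w = np.minimum(np.clip(t_grid / 1000.0, 0, 1),
               np.clip((10000.0 - t_grid) / 1000.0, 0, 1))
w /= w.sum()                                              # per-ns probability density

# Simulate detection times: draw from the template, then shift by a true delay.
true_delay = 40.0                                         # ns (made up)
cdf = np.cumsum(w)
events = np.interp(rng.random(16000), cdf, t_grid) + true_delay

def log_likelihood(delay):
    """Sum of log w(t_i - delay) over all detected events."""
    dens = np.interp(events - delay, t_grid, w, left=0.0, right=0.0)
    return np.sum(np.log(np.maximum(dens, 1e-300)))

delays = np.arange(-100.0, 200.0, 1.0)
ll = np.array([log_likelihood(d) for d in delays])
print(f"max-likelihood delay: {delays[ll.argmax()]:.0f} ns  (true: {true_delay:.0f} ns)")
# With O(10^4) events and ~1 us edges, the statistical spread on this toy estimate
# is several ns.
[/code]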
 
  • #217
Borg
Science Advisor
Gold Member
1,871
2,220
I have been reading about the accuracy of the GPS timestamps. I’m not sure what to think about two pieces of information. I’m highlighting my concerns below.

Page 9 of the OPERA paper (http://arxiv.org/pdf/1109.4897v1) states:

The Cs4000 oscillator provides the reference frequency to the PolaRx2e receiver, which is able to time-tag its “One Pulse Per Second” output (1PPS) with respect to the individual GPS satellite observations. The latter are processed offline by using the CGGTTS format [19]. The two systems feature a technology commonly used for high-accuracy time transfer applications [20]. They were calibrated by the Swiss Metrology Institute (METAS) [21] and established a permanent time link between two reference points (tCERN and tLNGS) of the timing chains of CERN and OPERA at the nanosecond level.

Reference [19] led me to this paper (ftp://ftp2.bipm.org/pub/tai/data/cggtts_format_v1.pdf) on CGGTTS formats. The conclusion on page 3 states:

[I]The implementation of these directives, however, will unify GPS time receiver software and avoid any misunderstandings concerning the content of GPS data files. Immediate consequences will be an improvement in the accuracy and precision of GPS time links computed through strict common views, as used by the BIPM for the computation of TAI, and improvement in the [B]short-term stability of reference time scales like UTC[/B].[/I]

I didn't see any references to the calibration of the PolaRx2e receivers other than the 2006 calibration. It looks to me like they used a calibration that was good for short-term stability and used it over the course of four years. Am I misreading this?
 
Last edited by a moderator:
  • #218
Astronuc
Staff Emeritus
Science Advisor
18,705
1,720
Funny that a newspaper, the Guardian, can have such relevant comments.
Read this:

http://www.guardian.co.uk/science/life-and-physics/2011/sep/24/1

. . . .
The author of the article, Jon Butterworth, makes some good points:
  • What would it mean if true? (certainly worth considering, but without being overly speculative)
  • Isn't this all a bit premature? (a point that is made numerous times in this thread)
  • What might be wrong? (again - a point that is made numerous times in this thread)
and, as a postscript to the article:
I received a comment on this piece from Luca Stanco, a senior member of the Opera collaboration (who also worked on the ZEUS experiment with me several years ago). He points out that although he is a member of Opera, he did not sign the arXiv preprint because while he supported the seminar and release of results, he considers the analysis "preliminary" due at least in part to worries like those I describe, and that it has been presented as being more robust than he thinks it is. Four other senior members of Opera also removed their names from the author list for this result.
Butterworth is a frequent contributor to the Guardian - http://www.guardian.co.uk/profile/jon-butterworth
 
  • #219
11
0
Regarding the 8.3 km of optical fiber, I did some looking. Admittedly we don't know what kind of cable it is, and cables do vary in temperature coefficient of delay (TCD) from one type to another. A good quality cable may have TCD = 0.5e-6/°C. The cable delay is roughly 30 µs. So 0.5e-6/°C gives about 0.015 ns/°C of temperature-dependent delay. That's too small to worry about.

Back to assumptions about the proton pulse shape consistency. How much might the shape change as a function of anything slow which might subsequently mess up the ability to model and average? Temperature? CERN grid voltage? Other effects?
 
  • #220
D H
Staff Emeritus
Science Advisor
Insights Author
15,393
682
Only because the speed of light has always been assumed to be the SR invariant speed "c".
Assumed? Do you really think that physicists would let such a critical assumption go untested?

A very brief history of physics in the latter half of the 19th century: The development of electrodynamics threw a huge wrench into the physics of that time. Electrodynamics was incompatible with Newtonian mechanics. It was Maxwell's equations (1861), not Einstein's special relativity (1905), that first said that c, the speed of electromagnetic radiation, was the same for all observers. People, including Maxwell, tried to rectify this incompatibility by saying that Maxwell's equations described the speed of light relative to some luminiferous aether. The Michelson–Morley experiment pretty much put an end to that line of thinking. Various other lines of thinking, now abandoned, gave ad hoc explanations to somehow reconcile electrodynamics and Newtonian mechanics.

Einstein's insight wasn't to magically pull the constancy of the speed of light out of some magician's hat. His insight was to tell us to take at face value what 40 years of physics had already been telling us: The speed of light truly is the same to all observers. Refinements of the Michelson-Morley experiment have borne this out to ever higher degrees of precision.

The modern view is that there will be some speed c that must be the same to all observers. In Newtonian mechanics, this was an infinite speed. A finite speed is also possible, but this implies a rather different geometry of spacetime than that implied by Newtonian mechanics. Massless particles such as photons will necessarily travel at this speed. Massive particles such as neutrinos can never travel at this speed. Photons are massless particles not only per theory but also per many, many experiments. That neutrinos do indeed have non-zero mass is a more recent development, but once again verified by multiple experiments.
 
  • #221
Vanadium 50
Staff Emeritus
Science Advisor
Education Advisor
2019 Award
23,852
6,297
the CERN experiment does strike me as a novel experiment. I mean really, can anyone cite an experiment where someone beamed anything through the earth like this before?
MINOS.
T2K.
 
Last edited by a moderator:
  • #222
3
0
This is not strictly true. When a particle is first created in a nuclear reaction it will generally have some non-zero initial velocity. That said, regardless of the initial velocity you are correct about the energy requirements to accelerate it further, but they are not claiming faster than c, only faster than light. The implication being that light doesn't travel at c.
I agree with the approach taken here. The most dangerous conjecture so far was taking one single baffling iteration of an experiment at face value (fine if we are to construct a road map) and dumping a whole bunch of extraordinary conclusions on top of it. We jumped straight to 'photons have mass' on the first page!

Anyway, I think this would have to be pretty close to the starting point. The implication can't be that we throw out c as the speed limit, but rather that observed photons don't travel at c. This may mean we have to redefine how we interpret 'vacuum'. This, I think, would imply that neutrinos have mass (i.e. they are not affected by the whole vacuum issue as much, like neutrons scattering light in a nuclear reactor because they move faster than the photons do in a non-vacuum) - something we are far more prepared for than 'c doesn't hold, let's scrap SR/GR'. In any event, it would be a very, very long and messy path of retrofitting theories before we could even consider scrapping any part of SR/GR. We have to address the 'frame' the neutrino travels in. Do we know enough about the neutrino to claim that it actually moved FTL? It may have 'appeared' to move FTL, but we know that FTL travel is possible, just not locally under GR.

If (a remote chance) this is true, I'd bet it is far more likely to have implications for the nature of the neutrino, possibly even the graviton (another very long shot), than to force a rework of a century's worth of work. So if you are keeping score at home we are at (long shot)^4, and we haven't even dealt with (long shot), so let's not get our panties in a bunch here.
 
  • #223
550
2
To be honest, this news doesn't seem all that surprising to me. Even before this measurement, there were already a number of strange things concerning neutrinos which are not consistent with special relativity. To name two: the mass of neutrinos was measured to be non-zero, yet it seems they can travel long distances with the same ease light does.
According to SR, the speed of a particle is given by [itex]v(E) = c\sqrt{1-\frac{m^2 c^4}{E^2}}[/itex]. Any particle with very low mass and energy large enough to measure will necessarily travel at a speed very close to c.
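For a sense of scale, here is that formula evaluated for rough, assumed numbers (an eV-scale neutrino mass and a beam energy of order 10 GeV are illustrative values, not figures from the paper):

[code]
# Evaluate 1 - v/c from v(E) = c*sqrt(1 - m^2 c^4 / E^2) for illustrative numbers.
# The mass and energy below are assumptions chosen for scale, not values from the paper.
import math

def one_minus_beta(m_eV, E_eV):
    """1 - v/c for a particle of mass m (eV/c^2) and total energy E (eV)."""
    x = (m_eV / E_eV) ** 2
    return x / (1.0 + math.sqrt(1.0 - x))   # numerically stable form of 1 - sqrt(1 - x)

# Neutrino: assume m ~ 1 eV and E ~ 10 GeV (order of magnitude for a beam neutrino).
print(f"neutrino: 1 - v/c ~ {one_minus_beta(1.0, 10e9):.1e}")      # ~5e-21
# For comparison, an electron (m ~ 511 keV) at the same energy:
print(f"electron: 1 - v/c ~ {one_minus_beta(511e3, 10e9):.1e}")    # ~1.3e-09
[/code]

At that level, any deviation of a light, high-energy neutrino from the speed of light is far below anything a time-of-flight experiment could resolve, which is the point of the formula above.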

The other one is that the probable mass-squared of the neutrino was repeatedly measured to be negative. It's sad that it takes a sensation like this to get the scientific community excited enough to actually try and explain these discrepancies, while they are, at the core, all of the same nature.
Every previous experiment attempting to directly measure [itex]m^2[/itex] for individual neutrino states has had a result within [itex]2\sigma[/itex] of 0. A tendency toward null results could simply indicate a tendency of such experiments to slightly underestimate the neutrino energy (or overestimate its momentum). In any case, all such results are effectively null and really can't be expected to be taken as evidence for exotic neutrino properties.
 
  • #224
1,227
2
Hi Michel,

Likelihood function = probability density function, just a different name, maybe with a different normalization. I apologize in advance because I don't think you're going to like this link very much. I don't. It takes an approach which obscures the intuition if you're not comfortable with the math. It also has links which may be useful. Keep following links, use Google search on the technical terms, and eventually you'll find something you're happy with. Try starting here:

http://en.wikipedia.org/wiki/Probability_density_function
Thanks dan_b, I appreciate a "back-to-basics" approach as opposed to the crazy speculations we can see here and there.
I am of course well aware of statistics and probabilities.
My interest was more in an explicit form for the Lk or wk functions mentioned in the paper.
My main aim was to check, in black and white, how the time of flight could actually be measured and where the information actually comes from.
My guess is that it simply mimics the waveshape of the proton beam intensity.
However, I am a little bit lost in the (useless) details.
I can't even be sure whether the SPS oscillations carry useful information and whether they were actually used.
The whole thing can probably be presented in a much simpler way, without the technicalities.
A simpler presentation would make it easier to show where the mistake in this paper lies.
I could not find any OPERA writing about this specific likelihood function.
However, I saw that such likelihood functions are probably in common use for other kinds of analysis in particle physics, and more specifically for neutrino experiments. It seems to be a common analysis technique that is re-used here. Therefore, I would be very cautious before loudly claiming that they made a mistake.

Nevertheless, figure 12 in the paper suggests to me that the statistical error is much larger than what they claim (see the Guardian article) and that - conversely - the information content in their data is much smaller than what we might believe.
Of the 16111 events they recorded, I believe that only those in the leading and trailing edges of the proton pulse contain information (at least for the figure 12 argument).
This is less than 1/10 of the total number of events: about 2000 events.
Obviously, drawing conclusions from only 2000 events would drastically decrease the precision of the result. It is therefore very striking to me that the influence of the number of events (16000 or 2000) on the precision of the result is not even discussed in the paper. The statistical uncertainties are certainly much larger than the systematic errors shown in table 2 of the paper.

Therefore, it is at least wrong to claim it is a six-sigma result.
I would not be surprised if it were a 0.1-sigma result!

In addition to the lower number of useful events (2000), as explained above, it is also obvious that the slope of the leading and trailing edges of the proton pulse will play a big role. If the proton pulse switched on over 1 second, it would obviously be impossible to determine the time of flight with a precision of 10 ns on the basis of only 2000 events.
But in this respect, the leading edge is actually of the order of 1000 ns long!
For measuring the time of flight with a precision of 10 ns, and on the basis of only 2000 events, I am quite convinced that a 1000 ns leading edge is simply inappropriate.

I have serious doubts about this big paper, and it would be good to have it web-reviewed!

Michel

PS
For the math-oriented people: is there a way to quantify where the information on the time of flight comes from in such an experiment? For example, would it be possible to say that the information comes, say, 90% from the pulse leading- and trailing-edge data and 10% from the SPS oscillations? And is it possible to relate this "amount of information" to the precision obtained?
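One standard way to quantify this is the Fisher information / Cramér-Rao bound: for detection times drawn from a normalised waveform w(t), the per-event information about a pure time shift is the integral of w'(t)^2 / w(t), so the flat top of the pulse contributes nothing and the edges (plus any fast structure such as the SPS oscillations) carry essentially all of it; the best achievable precision is then 1/sqrt(N*I). Below is a minimal numerical sketch with a made-up trapezoidal pulse; the numbers are illustrative only, not a re-analysis of the paper.

[code]
# Fisher-information sketch for a delay parameter, addressing the PS question:
# for detection times drawn from a normalised waveform w(t), the per-event Fisher
# information about a pure time shift is  I = integral of w'(t)^2 / w(t) dt,
# and the Cramer-Rao bound on the achievable precision is sigma >= 1/sqrt(N * I).
# The pulse shape and numbers below are made up, purely for illustration.
import numpy as np

dt = 1.0                                   # ns bins
t = np.arange(0.0, 10500.0, dt)
rise, total = 1000.0, 10000.0              # assumed 1 us edges on a 10 us pulse
w = np.minimum(np.clip(t / rise, 0, 1), np.clip((total - t) / rise, 0, 1))
w /= w.sum() * dt                          # normalise to a probability density (1/ns)

dwdt = np.gradient(w, t)
mask = w > 0
fisher_per_event = np.sum(dwdt[mask] ** 2 / w[mask]) * dt    # 1/ns^2 per event

for n_events in (16000, 2000):
    sigma = 1.0 / np.sqrt(n_events * fisher_per_event)
    print(f"N = {n_events:5d} events:  Cramer-Rao bound on the delay ~ {sigma:.0f} ns")

# The flat top contributes nothing (w' = 0 there); only the edges -- and, in the
# real analysis, any fine structure in the waveform -- carry timing information.
[/code]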
 
Last edited:
  • #225
f95toli
Science Advisor
Gold Member
2,948
442
I didn't see any references to the calibration of the PolaRx2e receivers other than the 2006 calibration. It looks to me like they used a calibration that was good for short-term stability and used it over the course of four years. Am I misreading this?

They are probably referring to the short-term stability in terms of the Allan deviation. There is no such thing as a single number for stability; the stability of a clock depends on the time interval you are interested in (in a non-trivial way). A good example is rubidium oscillators, which are good for short times (say up to tens of seconds) but have significant drift. Atomic clocks (and GPS) are not very good for short times, say a few seconds (and cesium fountains do not even HAVE a short-term value due to the way they work; they are not measured continuously).
Hence, the way most good clocks work (including, I presume, the one used in the experiment) is that they are built around an oscillator with good short-term stability, which is then "disciplined" against GPS to avoid drift and longer-term instability.

Btw, whenever a single value is given in articles, it usually (but not always) refers to the 100 s Allan deviation value.
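For anyone unfamiliar with the term, here is a minimal sketch of how a (non-overlapping) Allan deviation is computed from fractional-frequency data; the data are toy numbers, just to show why clock stability depends on the averaging time.

[code]
# Minimal sketch of the (non-overlapping) Allan deviation, to illustrate what
# "stability at averaging time tau" means. Toy data only.
import numpy as np

def allan_deviation(y, tau_samples):
    """Allan deviation of fractional-frequency data y, averaged over tau_samples points."""
    n = len(y) // tau_samples
    # Average the data in consecutive blocks of length tau_samples ...
    blocks = y[:n * tau_samples].reshape(n, tau_samples).mean(axis=1)
    # ... then take half the mean squared difference of successive block averages.
    return np.sqrt(0.5 * np.mean(np.diff(blocks) ** 2))

rng = np.random.default_rng(3)
# Toy clock: white frequency noise plus a slow linear drift (like a free-running Rb).
y = 1e-11 * rng.standard_normal(100_000) + 1e-16 * np.arange(100_000)

for tau in (1, 10, 100, 1000, 10000):
    print(f"tau = {tau:5d} samples:  ADEV ~ {allan_deviation(y, tau):.1e}")

# White frequency noise averages down roughly as 1/sqrt(tau); the drift term makes
# the deviation rise again at long tau -- hence there is no single "stability" number.
[/code]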

Also, for those of you who still think there is a problem with their time-keeping equipment: did you miss the part in the paper where it said their clocks have been independently calibrated? AND checked "in situ" by movable time transfer (which probably means that METAS simply installed one of their mobile atomic clocks in the OPERA lab temporarily).
 
