dan_b said:
Hi Michel,
Likelihood function = probability density function. Just a different name, maybe with a different normalization. I apologize in advance because I don't think you're going to like this link very much. I don't. Its approach obscures the intuition if you're not comfortable with the math. It also has links which may be useful. Keep following links, Google the technical terms, and eventually you'll find something you're happy with. Try starting here:
http://en.wikipedia.org/wiki/Probability_density_function
Thanks dan_b, I appreciate a "back-to-the-basics" approach as opposed to the crazy speculations we can see here and there.
I am of course well aware of statistics and probabilities.
My interest was more in an explicit form for the Lk or wk functions mentioned in the paper.
My main aim was to check, in black and white, how the time of flight could actually be measured and where the information actually comes from.
My guess is that it simply mimics the waveshape of the proton beam intensity.
However, I am a little bit lost in the (useless) details.
I can't even be sure whether the SPS oscillations carry useful information, or whether they were actually used.
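To make my guess concrete, here is a toy sketch of what I imagine the fit to be (this is only my reading, not the OPERA code, and every shape and number in it is invented): the normalized proton waveform is taken as the probability density of the neutrino event times, and the time of flight enters as a rigid shift found by maximizing the summed log-likelihood.

```python
# Toy maximum-likelihood shift fit. Everything here (waveform shape, delay,
# event count) is invented for illustration; it is only my guess at the
# principle, not the OPERA analysis code.
import numpy as np

rng = np.random.default_rng(0)

# Invented trapezoidal "proton waveform": 1000 ns edges, 10000 ns flat top.
t_grid = np.arange(0.0, 12000.0, 1.0)                     # ns
w = np.interp(t_grid, [0, 1000, 11000, 12000], [0, 1, 1, 0])
w /= np.trapz(w, t_grid)                                  # normalize to a PDF

# Simulate neutrino event times: draw from w, then shift by a "true" delay.
true_delta = 60.0                                         # ns, arbitrary
cdf = np.cumsum(w) / np.sum(w)
events = np.interp(rng.random(16000), cdf, t_grid) + true_delta

# The log-likelihood of a candidate delay is sum_k log w(t_k - delta);
# scanning it and taking the maximum gives the fitted delay.
def loglike(delta):
    p = np.interp(events - delta, t_grid, w, left=1e-12, right=1e-12)
    return np.sum(np.log(p))

deltas = np.arange(0.0, 120.0, 0.5)
ll = np.array([loglike(d) for d in deltas])
print("fitted delay:", deltas[np.argmax(ll)], "ns")
```

If this is roughly what they do, then the real question is which parts of the waveform actually pin the shift down.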
The whole thing could probably be presented in a much simpler way, without the technicalities.
A simpler presentation would make it easier to show where the mistake in this paper lies.
I could not find any OPERA document describing this specific likelihood function.
However, I saw that such likelihood functions are probably in common use for other kinds of analysis in particle physics, and more specifically in neutrino experiments. It seems to be a standard analysis technique that is being re-used here. Therefore, I would be very cautious before claiming loudly that they made a mistake.
Nevertheless, figure 12 in the paper suggests to me that the statistical error is much larger than what they claim (see the Guardian) and that, conversely, the information content in their data is much smaller than what we might believe.
Of the 16111 events they recorded, I believe that only those in the leading and trailing edges of the proton pulse contain information (at least as far as the figure 12 argument goes).
That is only about an eighth of the total number of events: roughly 2000 events.
Obviously, a conclusion based on only 2000 events would be drastically less precise. It is therefore very striking to me that the influence of the number of events (16000 or 2000) on the precision of the result is not even discussed in the paper. The statistical uncertainties are certainly much larger than the systematic errors shown in table 2 of the paper.
Therefore, it is at least wrong to claim that this is a six-sigma result.
I would not be surprised if it were a 0.1-sigma result!
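(As a pure counting effect, and assuming the usual 1/sqrt(N) scaling of a statistical error: going from 16111 events down to about 2000 would already inflate the statistical uncertainty by a factor sqrt(16111/2000) ≈ 2.8, before even looking at the shape of the edges.)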
In addition to the lower number of useful events (2000) explained above, it is also obvious that the slope of the leading and trailing edges of the proton pulse plays a big role. If the proton pulse switched on over 1 second, it would obviously be impossible to determine the time of flight with a precision of 10 ns on the basis of only 2000 events.
But here, the rise time of the leading edge is actually of the order of 1000 ns!
For measuring the time of flight with a precision of 10 ns, and on the basis of only 2000 events, I am quite convinced that a 1000 ns leading edge is simply inappropriate.
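A crude back-of-the-envelope to put this into numbers (assuming each edge event only localizes the shift to within the ~1000 ns edge width, and ignoring any finer structure of the waveform): sigma ≈ 1000 ns / sqrt(2000) ≈ 22 ns, which is well above 10 ns. Whether the fine structure of the waveform really buys back the missing factor is exactly what I would like to see spelled out.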
I have serious doubts about this big paper, and it would be good to have it web-reviewed!
Michel
PS
For the math-oriented people: is there a way to quantify where the information on the time of flight comes from in such an experiment? For example, would it be possible to say that the information comes, say, 90% from the leading- and trailing-edge data and 10% from the SPS oscillations? And is it possible to relate this "amount of information" to the precision obtained?
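A partial answer to my own question, in case it helps: one textbook way to quantify this (I have no idea whether OPERA does anything like it) is the Fisher information of the waveform with respect to a pure time shift, I = N * integral of (w'(t))^2 / w(t) dt. Since it is an integral over time, it splits additively between the edges, the flat top, the oscillations, etc., and the Cramér-Rao bound sigma >= 1/sqrt(I) ties the total amount of information to the best achievable precision. A toy sketch, with a waveform, ripple and event count that I invented:

```python
# Toy sketch: attributing the time-of-flight information to parts of the
# waveform via the Fisher information for a pure time shift. The waveform
# (trapezoid + small ripple standing in for the SPS structure), the ripple
# amplitude/period and the event count are all invented by me.
import numpy as np

dt = 1.0                                                  # ns
t = np.arange(0.0, 12000.0, dt)

# Invented waveform: 1000 ns edges, 10000 ns flat top with a 1% ripple.
base = np.interp(t, [0, 1000, 11000, 12000], [0.0, 1.0, 1.0, 0.0])
ripple = 0.01 * np.sin(2 * np.pi * t / 250.0)             # 250 ns period, invented
w = base * (1.0 + ripple)
w /= np.trapz(w, t)                                       # normalize to a PDF

# Fisher information density for a shift parameter: (w'(t))^2 / w(t).
# It is additive over time regions, so each region can be assigned a
# fraction of the total information.
wp = np.gradient(w, dt)
dens = np.where(w > 0, wp**2 / np.where(w > 0, w, 1.0), 0.0)

edges = (t < 1000) | (t > 11000)
I_edges = np.trapz(np.where(edges, dens, 0.0), t)
I_top = np.trapz(np.where(~edges, dens, 0.0), t)
I_total = I_edges + I_top

# Cramer-Rao: with N events, sigma(delta) >= 1 / sqrt(N * I_total).
# The absolute number is meaningless here (it depends entirely on the
# invented shape and ignores any detector timing smearing); only the
# attribution between regions is the point of the exercise.
N = 16000
print("fraction of information from the edges:", I_edges / I_total)
print("fraction from the flat-top ripple     :", I_top / I_total)
print("Cramer-Rao lower bound [ns]           :", 1.0 / np.sqrt(N * I_total))
```

With something like this, one could literally quote "90% from the edges, 10% from the oscillations" (or the other way around, depending on how sharp the oscillation structure really is) instead of arguing qualitatively.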