What's Delaying Fermilab's Muon g-2 Results Release?

Summary:
Fermilab's E989 experiment is conducting a precision measurement of the muon’s anomalous magnetic moment, with preliminary results initially expected in late 2020. The delay in releasing these results has led to speculation about the reasons, including the possibility of significant findings requiring further verification. Participants in the discussion emphasize the importance of ensuring accuracy before publication, arguing that delays are common in scientific research. The collaboration is expected to announce results in early 2021, with recent updates indicating a new measurement is set for April 7. The anticipation around these results highlights their potential implications for the Standard Model of Particle Physics.
  • #31
This is generating a huge amount of noise in the mainstream press in the UK: national news last night and a feature on breakfast television this morning!

Batten down the hatches Mentors, I see a storm on its way...
 
  • #32
AndreasC said:
there's another issue, which is the curvature of the Earth
No, it's a circle. Think about it.
 
  • #33
pbuk said:
No, it's a circle. Think about it.
Oh fair.
 
  • #34
Well, according to a lattice calculation of the leading hadronic contributions by the Wuppertal group, the SM prediction may after all be closer to the new measurement:

https://www.nature.com/articles/s41586-021-03418-1

It seems as if ##(g_{\mu}-2)## stays exciting from the theory side as well!
 
  • Like
Likes Demystifier
  • #36
First, this thread started because one member felt that the experiment had the results in January but was withholding them because they were hiding a problem. We know now that was totally untrue. Somebody made it up and then it was used to cast aspersions on the scientific team's competence, integrity, or both.

Second, it is also not the case that all new physics must affect g-2. It's actually quite easy: 2HDM with a light h and H and a heavy A, H+ and H-. One might even say "trivial". I'm not even a theorist and it took me less time to think of one than to type it. It may be relevant that the electroweak contribution is in the seventh significant digit, so a W' and Z' that were a factor of ~3 heavier (long excluded by direct searches) would be invisible here.

Third, there seems to be the feeling that 4.2 sigma means "new physics". If you go to the theory paper (Ref. [13] in the PRL) you can see in Figure 1 that the calculation is well within the "no new physics" band. Also, the BMW collaboration has a calculation they say is right on the money.

Fourth, as Chris Polly said, this assumes there is no problem with the technique. Such a problem does not need to be large - this is a 460 ppb measurement. There is a long history of different techniques giving different results - two recent ones are the neutron lifetime and the proton radius. This is why the JPARC experiment is so important. It would be important even if it were less sensitive than the FNAL experiment (as it stands, the two have comparable targets).
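
A quick back-of-the-envelope version of the second point, using round literature values (the exact figures below are assumptions for illustration, not numbers from the PRL):

Python:
# Rough scaling check: how much could a W'/Z' that is ~3x heavier shift a_mu?
# All inputs are round literature values / assumptions, for illustration only.
a_mu_ew   = 154e-11        # electroweak contribution to a_mu (enters ~7th significant digit)
sigma_exp = 54e-11         # ~460 ppb of a_mu ~ 1.166e-3, the quoted measurement precision

# A heavy-gauge-boson contribution scales roughly like (M_W / M_W')^2.
shift = a_mu_ew * (1 / 3) ** 2

print(f"hypothetical W'/Z' shift: {shift:.1e}")      # ~1.7e-10
print(f"experimental uncertainty: {sigma_exp:.1e}")  # ~5.4e-10 -> effectively invisible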
 
  • Like
  • Informative
Likes PeroK, weirdoguy, ohwilleke and 2 others
  • #37
ohwilleke said:
In China, it comes from coal-fired power plants with very few emissions controls.
BTW this post of mine reminded me of the great and catchy tune of Level 42:
 
  • #40
gmax137 said:
Apparently it depends on who is asked.

Just like the SM prediction for g-2. 😈
 
  • Haha
  • Like
Likes ohwilleke, vanhees71, Demystifier and 2 others
  • #42
mfb said:
Reaching 750 times the energy with the LHC technology would need a ring 750 times as large, ~20,000 km circumference. Even if you double the field strength it's still 10,000 km. Europe doesn't have space for that, but in North America it could just fit between Hudson Bay, Mexico, Washington and Washington DC. At least if we ignore all financial and technical problems.

They should build it in outer space :smile:
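
For anyone checking the numbers in that quote, a minimal sketch: at a fixed dipole field the ring size scales linearly with beam energy, and the only input is the well-known ~27 km LHC circumference.

Python:
# Ring-size scaling at fixed magnet technology: circumference ~ beam energy / field.
lhc_circumference_km = 26.7   # LHC circumference
energy_factor = 750           # "750 times the energy"

same_field   = lhc_circumference_km * energy_factor   # same dipole field
double_field = same_field / 2                         # double the field strength

print(f"same field:   ~{same_field:,.0f} km circumference")
print(f"double field: ~{double_field:,.0f} km circumference")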
 
  • #43
gmax137 said:
Not that it is important to this conversation, but no.
Also recall NAFTA, the "North American Free Trade Agreement" between Canada, the US, and Mexico.
I am always right even when I am not!
 
  • #44
mfb said:
in North America it could just fit between Hudson Bay, Mexico, Washington and Washington DC. At least if we ignore all financial and technical problems.

**** it, let's do it.
 
  • #45
JLowe said:
**** it, let's do it.
It's called dang your curse word... :oldbiggrin:
 
  • #46
I expect JPARC to end up close to the Fermilab value, and eventually most theory predictions to end up at the same value. The BMW prediction is matching the experimental result.

At least the accelerators for g-2 experiments are nice and compact. Here are some 3000 km diameter circles. Note the map distortions.
 
  • Like
Likes ohwilleke and vanhees71
  • #47
But it would be so much more fun if E34 gets the g-2 Theory Initiative value, and FNAL continues to match the BMW value.
 
  • Haha
Likes ohwilleke and vanhees71
  • #48
mfb said:
If there are two SM predictions and only one agrees with measurements...
At least we will learn more about SM predictions of hadronic effects.
Do you have any opinion about this second model? Is it real?
https://www.nature.com/articles/s41586-021-03418-1
 
  • #49
That's the above-mentioned lattice-QCD calculation of the leading hadronic contribution to ##(g-2)## by the Wuppertal (BMW) lattice-QCD collaboration. It's at least a hint that the prediction on the theory side still has to be consolidated. If I understand it right, what's compared to the measurement is a theoretical calculation using empirical input for the said hadronic contributions, which relies on dispersion-relation analyses of the data, and afaik that fitting is a tricky business of its own.

Of course the lattice calculation also has to be solidified and perhaps checked by other lattice collaborations, since lattice-QCD calculations are a tricky business too (recall the long debate about the deconfinement and/or chiral-transition temperature, which finally settled at the lower value of around 155 MeV predicted by the Wuppertal group ;-)).

Whether or not the ##(g-2)## results are really hints of "physics beyond the Standard Model" remains an exciting question.
 
  • Like
Likes PeroK and exponent137
  • #50
gwnorth said:
Central America is part of North America.
I hate this.

EDIT: To end the argument once and for all, Mexico is part of central southern North America. Either that or southern central North America. Perhaps both.
 
  • #51
vanhees71 said:
That's the above-mentioned lattice-QCD calculation of the leading hadronic contribution to ##(g-2)## by the Wuppertal (BMW) lattice-QCD collaboration. It's at least a hint that the prediction on the theory side still has to be consolidated. If I understand it right, what's compared to the measurement is a theoretical calculation using empirical input for the said hadronic contributions, which relies on dispersion-relation analyses of the data, and afaik that fitting is a tricky business of its own.

Of course the lattice calculation also has to be solidified and perhaps checked by other lattice collaborations, since lattice-QCD calculations are a tricky business too (recall the long debate about the deconfinement and/or chiral-transition temperature, which finally settled at the lower value of around 155 MeV predicted by the Wuppertal group ;-)).

Whether or not the ##(g-2)## results are really hints of "physics beyond the Standard Model" remains an exciting question.
Can't wait until I learn enough QFT for all that to not sound like complete gibberish to me!
 
  • Haha
Likes Demystifier
  • #52
How does this relate to the LHCb result? I think I get them mixed up. Are they measuring totally separate things that just have to do with muons? Are both sensitive to the same or similar QCD calculations?
 
  • #53
nolxiii said:
Are they measuring totally separate things that just have to do with muons?
Yes.
 
  • Like
Likes vanhees71 and ohwilleke
  • #55
exponent137 said:
What can this article tell us about the g-2 disagreement?
https://www.quantamagazine.org/protons-antimatter-revealed-by-decades-old-experiment-20210224/
At least, it can tell us that the hadrons are not well enough understood?

(Although we are talking about muons, the g-2 disagreement problem comes from the hadrons.)

Not much. The article is about proton structure and the proton parton distribution function (PDF).

The Theory Initiative's white paper basically looks at the propensity of electron-positron collisions to produce pions, and at the properties of the pions produced, in order to avoid having to calculate that contribution from first principles, and then extrapolates this to the muon g-2 calculation, while the BMW calculation works directly from QCD. The BMW paper argues that the Theory Initiative's transfer of the electron-positron collision data to the muon g-2 calculation has been done wrong (and an ugly mix of experimental results for parts of a calculation and lattice QCD simulations for other parts of it is certainly an unconventional approach).

In the proton radius puzzle, it turned out that the muonic hydrogen measurement of the proton radius was spot on, and that the older, less accurate measurement in ordinary (electronic) hydrogen was the source of the discrepancy. We could be seeing something similar here.
 
  • Like
Likes vanhees71 and exponent137
  • #56
But indeed the largest uncertainty in the theoretical calculation of ##(g-2)## of the muon comes from the radiative corrections due to the strong interactions (in low-energy language, "the hadronic contributions" or "hadronic vacuum polarization" (HVP)). If I understand it right, what's usually compared as "theory" to the data uses a semiempirical approach to determine these hadronic contributions, calculating the needed matrix elements via dispersion relations from measurements of the ##\mathrm{e}^+ + \mathrm{e}^{-} \rightarrow \text{hadrons}## cross section.

This is based on very general theoretical input, i.e., the unitarity of the S-matrix, but the devil is in the details, because it is anything but easy to use the dispersion relations to get the HVP contributions from data. So I wouldn't be too surprised if the systematic uncertainty of this procedure turns out to be underestimated. After all, there are hints from the lattice (by the Wuppertal/BMW lQCD group) that the HVP contributions may well be such that the discrepancy between "theory" and "data" is practically gone (only about a 1 sigma discrepancy).

Of course, lQCD calculations are a tricky business too. One must not forget that we are talking about high-accuracy physics here, which is never easy to get (neither experimentally nor theoretically).
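
For orientation, the dispersion-relation ("data-driven") determination described here is, schematically, the standard leading-order HVP integral over the measured hadronic ##R##-ratio (textbook form, quoted from memory, so take the normalization conventions with a grain of salt):
$$a_\mu^{\text{HVP,LO}} \;=\; \frac{\alpha^2}{3\pi^2}\int_{s_{\text{th}}}^{\infty}\frac{\mathrm{d}s}{s}\,K(s)\,R(s),\qquad R(s)=\frac{\sigma(\mathrm{e}^+\mathrm{e}^-\to\text{hadrons})}{\sigma(\mathrm{e}^+\mathrm{e}^-\to\mu^+\mu^-)},$$
with ##K(s)## a known, smooth QED kernel and ##s_{\text{th}}## the hadronic threshold. All the difficulty sits in compiling ##R(s)## consistently from many experiments, which is where the underestimated-systematics worry enters.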
 
  • Like
Likes ohwilleke, websterling, PeroK and 1 other person
  • #57
I am not an expert, but I don't believe that any of the theory calculations (of HVP) are pure ab initio calculations. All are ways of relating low energy measurements (done at places like Serpukhov in the 1960's) to g-2.

Had history evolved differently and we had a g-2 measurement first, we would be discussing whether there was an anomaly in the low-energy Serpukhov data.
 
  • Like
Likes Astronuc, vanhees71 and exponent137
  • #58
Vanadium 50 said:
I am not an expert, but I don't believe that any of the theory calculations (of HVP) are pure ab initio calculations. All are ways of relating low energy measurements (done at places like Serpukhov in the 1960's) to g-2.

Had history evolved differently and we had a g-2 measurement first, we would be discussing whether there was an anomaly in the low-energy Serpukhov data.
Are you sure?

I had understood lattice QCD to be akin to N-body simulations in cosmology: you discretize the QCD equations, the fields, and the time intervals, and then iterate. The description of what they did in their preprint sounds like this is what they did.

Quanta Magazine, interviewing the authors, summarizes what was done by the BMW group as follows:
They made four chief innovations. First they reduced random noise. They also devised a way of very precisely determining scale in their lattice. At the same time, they more than doubled their lattice’s size compared to earlier efforts, so that they could study hadrons’ behavior near the center of the lattice without worrying about edge effects. Finally, they included in the calculation a family of complicating details that are often neglected, like mass differences between types of quarks. “All four [changes] needed a lot of computing power,” said Fodor.

The researchers then commandeered supercomputers in Jülich, Munich, Stuttgart, Orsay, Rome, Wuppertal and Budapest and put them to work on a new and better calculation. After several hundred million core hours of crunching, the supercomputers spat out a value for the hadronic vacuum polarization term. Their total, when combined with all other quantum contributions to the muon’s g-factor, yielded 2.00233183908. This is “in fairly good agreement” with the Brookhaven experiment, Fodor said. “We cross-checked it a million times because we were very much surprised.”
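
To make the "discretize and iterate" picture concrete, here is a toy sketch in the same spirit: a Metropolis simulation of a discretized 1D Euclidean path integral (a harmonic oscillator). It is a cartoon only; real lattice QCD works with gauge and quark fields on a 4D lattice and is vastly more involved.

Python:
# Toy "discretize and iterate": Metropolis sampling of a 1D Euclidean path integral
# (harmonic oscillator, m = omega = hbar = 1). A cartoon of a lattice calculation,
# not lattice QCD.
import math, random

N, a = 64, 0.5              # number of time slices, lattice spacing
n_sweeps, delta = 20000, 1.4
x = [0.0] * N               # discretized path x(t_i), periodic boundary conditions

def local_action(xm, xi, xp):
    """Pieces of the Euclidean action that involve site i."""
    kinetic = ((xp - xi) ** 2 + (xi - xm) ** 2) / (2 * a)
    return kinetic + a * xi ** 2 / 2

x2_sum, n_meas = 0.0, 0
for sweep in range(n_sweeps):
    for i in range(N):
        xm, xp = x[(i - 1) % N], x[(i + 1) % N]
        trial = x[i] + delta * (2 * random.random() - 1)
        dS = local_action(xm, trial, xp) - local_action(xm, x[i], xp)
        if dS < 0 or random.random() < math.exp(-dS):
            x[i] = trial
    if sweep > 1000:                          # skip thermalization
        x2_sum += sum(v * v for v in x) / N
        n_meas += 1

print("<x^2> =", x2_sum / n_meas)             # approaches 0.5 as a -> 0

The real calculation does the analogous thing for QCD on a 4D lattice with dynamical quarks, which is why it needs hundreds of millions of core hours rather than seconds on a laptop.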
 
  • #60
ohwilleke said:
quantamagazine.org said:
The researchers then commandeered supercomputers in Jülich, Munich, Stuttgart, Orsay, Rome, Wuppertal and Budapest and put them to work on a new and better calculation. After several hundred million core hours of crunching...
So I have an off-topic question about this computer time. One hundred million hours is about 11,000 years; split between the seven computers mentioned, that would be what, 1,600 years each. How does that work?
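
A hedged back-of-the-envelope answer (the machine sizes below are pure assumptions, not numbers from the article): "core hours" count wall-clock hours multiplied by the number of cores running, so 11,000 single-core years collapse to weeks or months of real time on machines with ##10^5## or more cores.

Python:
# "Core hours" = wall-clock hours x cores in use. Machine sizes here are assumed
# for illustration; they are not taken from the Quanta article.
core_hours = 3.0e8            # "several hundred million core hours" (assume 3e8)
cores_per_machine = 100_000   # assumed partition size of one supercomputer
machines = 7

wall_hours = core_hours / (cores_per_machine * machines)
print(f"~{wall_hours:.0f} wall-clock hours  (~{wall_hours / 24:.0f} days)")
# -> roughly a few hundred hours, i.e. a couple of weeks of real time, not millennia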
 
