The Latest Higgs Boson Mass Measurement

  • #1
ohwilleke
Gold Member
TL;DR Summary
The world average Higgs boson mass is now about 125.27 GeV, with a smaller uncertainty than before.
The current Particle Data Group global average measurement for the Higgs boson mass is 125.25 ± 0.17 GeV.

The previous combined ATLAS Higgs boson mass measurement (via the Particle Data Group) was 124.86 ± 0.27 GeV from 2018 (using Run-2 data), and the previous combined CMS Higgs boson mass measurement (from the same source) was 125.46 ± 0.16 GeV from 2020 (also using Run-2 data). These measurements were consistent with each other at the 1.9 sigma level. The Run-1 measurement from ATLAS and CMS combined (from the same source) was 125.09 ± 0.24 GeV.

The new ATLAS diphoton decay channel Higgs boson mass measurement is 125.17 ± 0.14 GeV. The new ATLAS combined Higgs boson mass measurement is 125.22 ± 0.14 GeV.

The new ATLAS combined measurement is consistent with the old CMS combined Run-2 measurement at the 1.1 sigma level.

The new measurement should pull up the global average measurement of the Higgs boson mass to about 125.27 GeV and should also reduce the uncertainty in the global average measurement to ± 0.13 GeV or less. This is an uncertainty of roughly one part per thousand.
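For a rough sense of where that estimate comes from, here is a back-of-the-envelope inverse-variance weighted average of the two combined Run-2 measurements in Python. A real PDG-style average accounts for correlated systematics and additional inputs, so this naive version is only indicative:

[code]
# Naive inverse-variance weighted average of the two combined Run-2
# measurements. The real PDG procedure accounts for correlations
# between the inputs, so this is only a rough indication.
measurements = [
    (125.22, 0.14),  # new ATLAS combined (GeV)
    (125.46, 0.16),  # CMS combined, 2020 (GeV)
]
weights = [1.0 / sigma**2 for _, sigma in measurements]
mean = sum(w * m for (m, _), w in zip(measurements, weights)) / sum(weights)
sigma = sum(weights) ** -0.5
print(f"naive average: {mean:.2f} +/- {sigma:.2f} GeV")  # -> 125.32 +/- 0.11
[/code]

From the new paper's abstract: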
The mass of the Higgs boson is measured in the H→γγ decay channel, exploiting the high resolution of the invariant mass of photon pairs reconstructed from the decays of Higgs bosons produced in proton-proton collisions at a centre-of-mass energy √s = 13 TeV. The dataset was collected between 2015 and 2018 by the ATLAS detector at the Large Hadron Collider, and corresponds to an integrated luminosity of 140 fb⁻¹. The measured value of the Higgs boson mass is 125.17 ± 0.11 (stat.) ± 0.09 (syst.) GeV and is based on an improved energy scale calibration for photons, whose impact on the measurement is about four times smaller than in the previous publication. A combination with the corresponding measurement using 7 and 8 TeV pp collision ATLAS data results in a Higgs boson mass measurement of 125.22 ± 0.11 (stat.) ± 0.09 (syst.) GeV. With an uncertainty of 1.1 per mille, this is currently the most precise measurement of the mass of the Higgs boson from a single decay channel.
ATLAS Collaboration, "Measurement of the Higgs boson mass with H→γγ decays in 140 fb⁻¹ of √s = 13 TeV pp collisions with the ATLAS detector", arXiv:2308.07216 (August 14, 2023) (submitted to Phys. Lett. B).
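The ± 0.14 GeV total uncertainty quoted earlier is just the statistical and systematic components from the abstract added in quadrature, which is easy to check:

[code]
# Total uncertainty = statistical and systematic added in quadrature.
stat, syst = 0.11, 0.09  # GeV, from the abstract above
total = (stat**2 + syst**2) ** 0.5
print(f"total: {total:.2f} GeV")          # -> 0.14 GeV
print(f"relative: {total / 125.22:.1e}")  # -> ~1.1e-03, i.e. 1.1 per mille
[/code]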
 
  • #2
At the risk of getting an answer that's well beyond my comprehension, how do they determine the mass of something like the Higgs boson and other very short-lived particles? A very complicated and comprehensive process of adding up all the 'bits and pieces' that come out of those collisions?
 
  • #3
You take the decay products' energies and momenta and use [itex]m=\sqrt{\left(\sum E\right)^2-\left|\sum \vec{p}\right|^2}[/itex]
 
  • #4
Vanadium 50 said:
You take the decay products' energies and momenta and use [itex]m=\sqrt{\left(\sum E\right)^2-\left|\sum \vec{p}\right|^2}[/itex]
Correct, as far as it goes.

First, you need to run many, many billions or trillions of collisions to collect enough events with the decay products you are looking for to make the final result statistically significant.

Then, you screen the total output of the collider to focus on a particular set of decay products, such as diphotons or four-lepton final states. A computer classifies each event by its reconstructed decay products and sorts them into a neat data set that you then analyze. A considerable amount of scientific skill goes into selecting as much usable data as possible, and as little irrelevant data as possible, from the total data set. You also need to be careful that the selection does not subtly introduce systematic bias into the measurement, e.g. by accepting or rejecting events (with false positive and false negative rates whose likelihood you can quantify) in a way that makes the selected sample systematically more massive or less massive than the true underlying sample.
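To make the idea concrete, here is a cartoon of such a selection cut in Python. The field names and thresholds are invented for illustration; the real ATLAS selection involves far more elaborate trigger, identification, and isolation requirements:

[code]
# Cartoon of a diphoton event selection. The cuts and record fields
# here are made up for illustration only.
def passes_diphoton_selection(event):
    """Keep events with at least two isolated photons above a pT cut."""
    good = [p for p in event["photons"]
            if p["pt"] > 25.0 and p["isolated"]]  # hypothetical cuts
    return len(good) >= 2

events = [
    {"photons": [{"pt": 62.0, "isolated": True},
                 {"pt": 41.5, "isolated": True}]},   # passes
    {"photons": [{"pt": 30.0, "isolated": False},
                 {"pt": 18.0, "isolated": True}]},   # fails
]
selected = [e for e in events if passes_diphoton_selection(e)]
print(len(selected), "of", len(events), "events selected")  # -> 1 of 2
[/code]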

Once you do that, you apply the formula above to each event (again, done automatically by a computer), bin the results by the total mass-energy of the decay products you are studying, and determine the number of events in each bin.
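In code, that step is just the invariant mass formula from #3 applied event by event and histogrammed. The photon four-momenta below are made up so that the pair mass lands at 125 GeV:

[code]
import math

def invariant_mass(particles):
    """m = sqrt((sum E)^2 - |sum p|^2) for four-momenta (E, px, py, pz) in GeV."""
    E, px, py, pz = (sum(p[i] for p in particles) for i in range(4))
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Two made-up, back-to-back massless photons; in practice the
# four-momenta come from the reconstruction chain, one pair per event.
pair = [(62.5, 62.5, 0.0, 0.0), (62.5, -62.5, 0.0, 0.0)]
m = invariant_mass(pair)
print(f"m_gammagamma = {m:.1f} GeV")  # -> 125.0 GeV

# Binning: count events in 1 GeV wide mass bins to build the histogram.
bins = {}
bins[int(m)] = bins.get(int(m), 0) + 1
[/code]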

Lots of scientist labor goes into the first three tasks, but it is front-loaded: setting up the particle collider equipment, programming the computers that process the raw data from the detectors, and determining things like the false negative and false positive rates of particular detectors, using well-understood decays as benchmarks to calibrate them. After the initial setup, there is just run after run of collisions on autopilot, routine maintenance of the equipment, and fine-tuning and debugging of the software.

Then, you have to separate signal from background, so you can distinguish diphotons from the process you are looking for from diphotons produced by other processes, since the collider produces a mess of results from every possible set of collision products and there is more than one way to produce any given set of decay products (although some final states, like four leptons and diphotons, have smaller backgrounds than others, making them cleaner to measure with less background uncertainty).

Calculating the backgrounds is an immense, labor-intensive undertaking for the scientists involved, but it only has to be done once for each set of possible backgrounds. Once you compare the data to the backgrounds, you get a chart like this one:
[Chart: diphoton invariant-mass distribution with the fitted background, from the paper cited in #1]

Then, you statistically separate signal events from background events by subtracting the expected number of background events in each bin from the data, and you get a chart like the one below, in which the binned results are fit to a curve (this particular chart shows two different simulated predictions rather than the actual results of this measurement):
[Chart: background-subtracted diphoton mass peak with fitted signal curves, from the paper cited in #1]

This curve (with real data) shows the resonance of an unstable particle, which is modeled with a Breit-Wigner distribution.

The distribution's shape, relative to the mass-energy and event-count axes, gives you two pieces of information we care about. The peak is the best fit for the "rest mass" of the particle, as that is typically defined. The distance between the left and right sides of the curve at half of the peak height is the "width" of the distribution. The inverse of the width, in the proper units, gives you the mean lifetime of the particle before it decays via the kind of decay shown (for the Higgs boson, the theoretically predicted width is about 4 MeV for all possible types of decays combined). But determining the width is more complicated than that, because you have to statistically separate the apparent width due to systematic and statistical error from the intrinsic spread in the mass-energy of the measured events, which is the Breit-Wigner width.
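The width-to-lifetime conversion is a one-liner: the mean lifetime is τ = ħ/Γ. For the predicted total Higgs width of about 4 MeV:

[code]
# Mean lifetime from the total decay width: tau = hbar / Gamma.
hbar = 6.582119569e-25  # GeV * s
gamma_higgs = 4.1e-3    # GeV (~4 MeV predicted total Higgs width)
tau = hbar / gamma_higgs
print(f"tau ~ {tau:.1e} s")  # -> ~1.6e-22 s
[/code]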

Another very labor-intensive task, which I omitted from the description above, is determining and quantifying the uncertainties from all significant sources in your result. This gives you a table like the one below, plus a separate calculation of the statistical uncertainty of the result:

[Table: breakdown of systematic uncertainties in the mass measurement, from the paper cited in #1]

The uncertainties are then combined into a chart like this one, which is used to determine the combined and statistical uncertainties of the result at a given sigma level:
[Chart: likelihood scan of the Higgs boson mass used to extract the uncertainties, from the paper cited in #1]
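For intuition about how such a scan is read: for an idealized (parabolic, i.e. Gaussian) −2ΔlnL curve, the 1 sigma interval is wherever the curve sits below 1. A minimal sketch, using the combined result's central value and uncertainty as the inputs:

[code]
# Reading a 1-sigma interval off an idealized (parabolic) likelihood
# scan: the interval is where -2*Delta(ln L) < 1. Real scans come
# from the full fit, not from an assumed parabola.
best, sigma = 125.22, 0.14  # GeV

def neg2_delta_lnl(m):
    return ((m - best) / sigma) ** 2  # parabolic approximation

grid = [125.0 + 0.001 * i for i in range(500)]
inside = [m for m in grid if neg2_delta_lnl(m) < 1.0]
print(f"1-sigma interval: [{min(inside):.2f}, {max(inside):.2f}] GeV")
# -> [125.08, 125.36], i.e. the central value +/- 0.14
[/code]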


Correctly estimating systematic uncertainties is probably the hardest part of the entire measurement process (the same is true of theoretical predictions) and is a mix of art and science. Any time you see an anomalous result (in terms of sigmas of deviation from the expected value or from previous measurements), underestimation of systematic error is at the top of the list of possible culprits that do not require invoking "new physics" to explain the outlier.

All of the charts shown above are from the new paper cited in #1 of this thread.
 
  • #5
So, why is this important? What does knowing the mass to less than a quarter of a percent tell us that knowing the mass to a quarter of a percent didn't?
 
  • #6
Vanadium 50 said:
So, why is this important? What does knowing the mass to less than a quarter of a percent tell us that knowing the mass to a quarter of a percent didn't?
Are you asking why this is actually important, or are you asking why ohwilleke made their post?
 
  • #7
Both, I think. Out of hundreds or thousands of results, why focus on this one?
 
  • #8
If you have a hammer, everything looks like a Higgs Boson.
 
  • #9
I'm in favor of doing the best measurement you can. What I am trying to get at is why this particular number is interesting enough to be highlighted. What do we know now that we didn't know before?
 
  • #10
Do you mean highlighted here or published in general?
 
  • #11
Vanadium 50 said:
So, why is this important? What does knowing the mass to less than a quarter of a percent tell us that knowing the mass to a quarter of a percent didn't?
First of all, this is one of the fundamental moving parts of the Standard Model. It is in the same category, in that respect, as the W and Z boson masses, the strong force coupling constant, or the neutrino masses. That makes it more important than, say, a sigma hadron mass, which is a derived quantity that is rarely an input into anything else.

The fact that this is a quantity that has only been measured at all for about a decade, and is not yet known with ultra-precision (unlike, for example, the fine structure constant), also means that changes in the value are still big enough to be relevant.

And, it has been three years since the last major update of its value, so it is something that doesn't happen all that often.

Second, as a result of being fundamental, it is an input to other things, like the Standard Model prediction of the W boson mass from other constants, and the theoretical predictions for the Higgs boson's decay fractions, for which we are actively generating experimental data to compare against.

Third, precision is important if you want to look for deep connections between the fundamental constants of the Standard Model and possibly determine relationships that make them more than random numbers. For example, the sum of the Higgs Yukawas is very close to exactly 1, but that sum is very sensitive to uncertainties in the Higgs boson mass. In general, the greater the precision with which the fundamental constants have been measured, the less likely it is that a numerical relationship between them is just a fluke.

Finally, any measurement can be interesting if you explain how it was made because it illustrates how HEP is done in practice.
 
  • #12
First, the electron mass is also a fundamental parameter of the standard model. If its error were reduced by a quarter, would that be news?

Second, the Higgs mass is fundamentally different from the W mass. The W mass has a predicted value, and one can compare prediction and measurement. The Higgs does not. If one did something foolish and took the W mass as given and used it to "predict" the Higgs mass, then since the Higgs dependency is only logarithmic, a sub-percent improvement makes zero difference. (It turns out it's not just hard, it's impossible.)

The third argument, paraphrased as "maybe it will later turn out to be useful" can equally be applied to any measurement. Why this one?

In short, it does not do what you claim it does.

I'm not saying they shouldn't have published - of course they should have. I'm questioning why this result is more interesting than the hundreds of others.
 
  • #13
Vanadium 50 said:
First, the electron mass is also a fundamental parameter of the standard model. If its error were reduced by a quarter, would that be news?
The more precisely we already know the value of a physical constant, the less interesting increased precision in its measurement becomes. The interesting thing about a reduction in the electron mass uncertainty would be that it would make the electron mass one of the most precisely determined physical quantities in all of scientific history, even if the value itself wasn't very newsworthy. A summary of what we know about the Standard Model constants (including neutrinos, and assuming that the cosmology bounds are valid) is as follows (not including the most recent update to the Higgs boson mass):

[Table: summary of the measured values and uncertainties of the Standard Model constants]

Note that I have omitted footnotes and caveats (for example, strictly speaking, the neutrino masses are the absolute values of the three neutrino mass eigenstates, from oscillation and cosmology data combined, and the units for Newton's G are cut off in this screenshot).

Vanadium 50 said:
Second, the Higgs mass is fundamentally different from the W mass. The W mass has a predicted value, and one can compare prediction and measurement. The Higgs does not. If one did something foolish and took the W mass as given and used it to "predict" the Higgs mass, then since the Higgs dependency is only logarithmic, a sub-percent improvement makes zero difference. (It turns out it's not just hard, it's impossible.)
The W boson doesn't really have a predicted value per se. The W boson mass, Z boson mass, electromagnetic coupling constant, and weak force coupling constant have a functional relationship to each other at tree level, which also incorporates the Higgs boson mass and top quark mass beyond tree level. The W boson mass is usually the one predicted from the other physical constants in that relationship because it happens to be the least precisely measured at the moment, but it isn't intrinsically any more or less fundamental than any of the others.
Vanadium 50 said:
The third argument, paraphrased as "maybe it will later turn out to be useful" can equally be applied to any measurement. Why this one?
Because we are doing lots of experiments at the LHC to test properties that are derivative of the Higgs boson mass, while a lot of the experiments testing properties that are derivative of other physical constants have already been done.

Also, in terms of theoretical relationships between Standard Model parameters, even modest relative uncertainties in larger parameter values can contribute more uncertainty to the overall relationship than greater relative uncertainties in smaller parameter values. A small reduction in the Higgs boson mass uncertainty can, in many cases, have a big impact on our ability to tell whether a proposed relationship between Standard Model parameters holds.

And, again, it comes down to newness. Usually, reductions in uncertainty in absolute terms are front-loaded. There is still room for big reductions in the Higgs boson mass uncertainty because only a few measurements of it have been made, while there is less room for surprises in parameters that we have been measuring for a long time.
Vanadium 50 said:
In short, it does not do what you claim it does.

I'm not saying they shouldn't have published - of course they should have. I'm questioning why this result is more interesting than the hundreds of others.
I think that pretty much all new SM parameter measurements are more interesting than the myriad measurements of heavy and exotic hadron properties like baryons with multiple heavy valence quarks, or tetraquark masses, or J/psi decay fractions, or updated parton distribution function data.
 
  • #14
ohwilleke said:
The more precisely we already know the value of a physical constant, the less interesting increased precision in its measurement
That argues that the H mass is more interesting than the W mass. That is certainly not what actual, practicing particle physicists think.
ohwilleke said:
The W boson doesn't really have a predicted value per se.
As my sainted grandmother used to say, "When you find yourself in a hole, stop digging."

You give me the Z mass, the weak mixing angle and the fine structure constant and I'll tell you the W mass at tree level. (Add the top and Higgs and I can do it at one loop.) There is nothing comparable for the Higgs mass. It just...is.
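As a worked example of the tree-level version (with standard input values; the effective weak mixing angle is used here, and the exact output depends on the renormalization scheme):

[code]
import math

# Tree-level electroweak relation: m_W = m_Z * cos(theta_W).
m_Z = 91.1876            # GeV
sin2_theta_W = 0.23122   # effective weak mixing angle
m_W_tree = m_Z * math.sqrt(1.0 - sin2_theta_W)
print(f"m_W (tree level) ~ {m_W_tree:.2f} GeV")  # -> ~79.95 GeV
# The loop corrections, which bring in the top quark and Higgs boson
# masses, close most of the gap to the measured ~80.37 GeV.
[/code]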

ohwilleke said:
A small reduction in the Higgs boson mass uncertainty can, in many cases, have a big impact on our ability to tell whether a proposed relationship between Standard Model parameters holds.
I am going to call you on this. Please cite three models which were allowed under the old number and disallowed under the new numbers. You said "many cases", so surely you can find three.
 

