New LHC Results 2015: Interesting Diphoton Excess?

In summary: CMS has released its conference note for the 2016 data and sees no evidence of a diphoton excess at the mass where the excess appeared in 2015. CMS again removed events where both photons were detected in the endcap, a category that had been shown in earlier analyses. Nothing is public from ATLAS at this time. Had the 2015 bump been real, its significance should have increased by roughly 2.5 sigma with the additional data; instead it has essentially disappeared, suggesting it was a statistical fluke. ATLAS will present their results on Friday morning, and the rumor mill claims they will show the same thing. Some people may feel disappointed, but ruling things out is also important work.
  • #1
See https://www.physicsforums.com/threads/new-lhc-results-2015-tuesday-dec-15-interesting-diphoton-excess.84798 and the status from Monday.

CMS released their conference note a bit earlier. They see absolutely nothing at the mass range where the excess appeared in 2015.

It is a bit curious that they removed events where both photons were detected in the endcap. That category was shown in earlier analyses - why not this year?

Nothing public from ATLAS so far.

Summary plot (a peak in the data corresponds to a downward spike; lower = more significant):

[Summary plot: CpCXCqDXgAAVvNG.jpg]
 
  • #2
ATLAS will be showing their results Friday morning.
 
  • #3
It is worth noting that the naive expectation, if the bump had been real, would be for the 2016 data set (which is much bigger than the 2015 data set in which the 750 GeV bump was seen) to increase the significance of that resonance by about 2.5 sigma. Anything less than that increase in significance would have cast doubt on the 750 GeV bump being real.

Recall that the significance of the 2015 bump was as follows: The local significances were given as 3.9 sigma (ATLAS) and 2.6 sigma (CMS). The global significances were just 2.0 (ATLAS) and less than 1.2 (CMS) – but the excess was observed at the same place, so we cannot “look elsewhere” for both experiments separately.

Had the significance of the original 2015 bump at CMS increased by the roughly 2.5 sigma that would have been expected with more data, the resonance would have reached a local significance of about 5.1 sigma in the new data, which would have been unmistakable. Instead, the bump is essentially gone from CMS, as expected if the bump in the 2015 data was almost entirely a statistical fluke.
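
For the non-experts, the arithmetic behind that expectation is the usual statistical scaling: if the 2015 bump reflected a real signal at its observed strength, its expected local significance would grow roughly like the square root of the integrated luminosity. Here is a minimal back-of-the-envelope sketch; the dataset sizes are round numbers I am assuming for illustration, not figures from this thread, and the exact result depends on them and on how the two datasets are combined.

[code=python]
# Naive sqrt(luminosity) scaling of an expected significance.
# Assumption: the 2015 local significance reflects the true signal strength.
import math

z_2015 = 2.6        # CMS local significance of the 2015 bump (quoted above)
lumi_2015 = 3.0     # assumed 2015 dataset size in /fb (my round number)
lumi_2016 = 13.0    # assumed 2016 ICHEP dataset size in /fb (my round number)

z_combined = z_2015 * math.sqrt((lumi_2015 + lumi_2016) / lumi_2015)
print(f"expected combined local significance: ~{z_combined:.1f} sigma")
# ~6 sigma with these inputs, i.e. the same ballpark as the ~5 sigma quoted above:
# a real bump should have become unmistakable, not vanish.
[/code]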

The rumor mill claims that ATLAS will see basically the same thing, but we will know for sure a little more than twelve hours from now.
 
  • #4
Which talk was it? Is the presentation online at Indico?
 
  • #5
vanhees71 said:
Which talk was it? Is the presentation online at Indico?
The talks are this morning (Chicago time) as I understand. This is a pre-release of the CMS conference note.

A student I co-supervise has the (thankless) task of giving a talk in a parallel session at the same time ...
 
  • #6
Orodruin said:
The talks are this morning (Chicago time) as I understand. This is a pre-release of the CMS conference note.

A student I co-supervise has the (thankless) task of giving a talk in a parallel session at the same time ...
Well, that may feel bad, but ruling things out is also important work. I remember the poor CLAS people who had to present the non-confirmation of the pentaquark...
 
  • #7
vanhees71 said:
Well, that may feel bad, but ruling things out is also important work.
You misread me: the student is giving a talk in a session parallel to the ATLAS and CMS results (the talk has nothing to do with the diphoton resonance), with the implication that basically nobody will go to that session. This would be true regardless of what ATLAS and CMS present.
 
  • #8
Ah, I see. Yes, that's really bad. As interesting as big conferences with parallel sessions can be, I prefer smaller workshops.
 
  • #9
Depending on the student, it could still be OK. Some people, even professionals with elite educations, have terrible stage fright, and they can be a bit more relaxed knowing that fewer people are in the audience and that the only people there are the ones who are really deeply interested in what they have to say.
 
  • #10
If it is three people staring at their laptops it is even more depressing, but let us get back on-topic.
 
  • #11
CMS took down their conference note again (well, replaced it by a 2-page PDF with a meaningless abstract and no content).

The talks start at 16:00 CERN time; this post was made at 11:54 CERN time, so the talks start about 4 hours after whatever time the forum shows for this post (if your time zone is set correctly).

@ohwilleke: CMS had updated their result and got a higher significance for Moriond. The 8 TeV data indicated that the 2015 excess was on the high side even if there was a new particle.
 
  • #14
http://backreaction.blogspot.de/2016/08/the-lhc-nightmare-scenario-has-come-true.html

What do you think about Sabine Hossenfelder's article here?
Especially about this:
Now that the diphoton bump is gone, we’ve entered what has become known as the “nightmare scenario” for the LHC: The Higgs and nothing else. Many particle physicists thought of this as the worst possible outcome. It has left them without guidance, lost in a thicket of rapidly multiplying models. Without some new physics, they have nothing to work with that they haven’t already had for 50 years, no new input that can tell them in which direction to look for the ultimate goal of unification and/or quantum gravity.
I don't know; I found this declaration depressing and a little bit rushed for now... nothing is over yet and nothing is in vain. The comments get even more depressing [for collider physics etc.].
 
  • #15
The Tevatron found the top quark nine years after it reached its maximal energy. The LHC has not even reached its design energy yet, and has been close to it only for a bit more than a year.

The data analyzed so far is not even 1% of the total planned integrated luminosity. Let's check again with the ~30-50/fb at the end of the year, with ~300/fb in ~2025, and with ~3000/fb in ~2035.
Also, various analyses haven't been updated with 2016 data yet, or haven't even started with 13 TeV data.
 
  • #16
I completely disagree with Sabine's analysis for many reasons, and I suspect most particle physicists do as well. Having said that, the nightmare scenario is a little closer than it used to be.

She casts a lot of aspersions at certain theoretical ideas, which I think is unjustified. But look, there was always the possibility that one day, due to technical/financial limitations, the methods we have used to discover and probe the high-energy frontier would hit diminishing returns; that there might be a limit to human ingenuity and to experimental guidance. Fortunately I don't think we are anywhere near that point, so all of this boils down to certain hypotheticals and extrapolations concerning human behavior. In short, yet another storm in a teacup!
 
  • #17
To be clear for non-experts: the Spin-0 and Spin-2 analyses have a very high degree of overlap. Each is optimized for a particular spin hypothesis, but fundamentally they are looking at the same data. It's possible that one sees a 4 sigma excess and the other a 5 sigma excess, but it's not possible that one sees a 5 sigma excess and the other nothing.
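
To make the "same data" point concrete, here is a minimal background-only toy (all rates are invented for illustration; nothing here comes from the actual ATLAS or CMS selections). Two selections that share most of their events produce strongly correlated significances, so one cannot fluctuate up to 5 sigma while the other sees nothing.

[code=python]
# Toy pseudo-experiments: two overlapping selections counting events in the
# same signal window, background only.  The shared events force the two
# "significances" to move together.
import numpy as np

rng = np.random.default_rng(1)
n_toys = 100_000
mu = 50.0  # expected background count in the window (invented number)

n_shared = rng.poisson(0.85 * mu, n_toys)  # events passing both selections
n_only_a = rng.poisson(0.10 * mu, n_toys)  # events unique to selection A ("spin-0"-like)
n_only_b = rng.poisson(0.05 * mu, n_toys)  # events unique to selection B ("spin-2"-like)

n_a, n_b = n_shared + n_only_a, n_shared + n_only_b
# crude Gaussian significance of an upward fluctuation over the known expectation
z_a = (n_a - 0.95 * mu) / np.sqrt(0.95 * mu)
z_b = (n_b - 0.90 * mu) / np.sqrt(0.90 * mu)

print("correlation of the two significances:", round(np.corrcoef(z_a, z_b)[0, 1], 2))
print("largest |z_a - z_b| over all toys:", round(float(np.max(np.abs(z_a - z_b))), 2))
[/code]

In this toy the two significances come out about 90% correlated and never differ by more than roughly two sigma across all pseudo-experiments, even though each one individually fluctuates freely.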

There are unhappy theorists out there, but it's not like the experiments didn't warn them that the significances were weak. They didn't want to hear about trials factors or partonic luminosity ratios or anything. We can see where that line of thinking leads.
 
  • #18
Vanadium 50 said:
There are unhappy theorists out there, but it's not like the experiments didn't warn them that the significances were weak. They didn't want to hear about trials factors or partonic luminosity ratios or anything. We can see where that line of thinking leads.
Just to be clear, there are also many theorists who did not jump to premature conclusions or ride the citation wave, and who actually took the hint for what it was.
 
  • #19
If anything, spending 9 months working on BSM theories for a deviation of global significance of 2.0 is questionable.

There are many outstanding and difficult problems which need further assessment in theoretical physics.

(Edit: the 750 GeV bump was never, and never will be, one of these.)
 
  • #20
mfb said:
The Tevatron found the top quark nine years after it reached its maximal energy. The LHC has not even reached its design energy yet, and has been close to it only for a bit more than a year.

The data analyzed so far is not even 1% of the total planned integrated luminosity. Let's check again with the ~30-50/fb at the end of the year, with ~300/fb in ~2025, and with ~3000/fb in ~2035.
Also, various analyses haven't been updated with 2016 data yet, or haven't even started with 13 TeV data.
I was actually wondering about this. What was it that allowed them to see the top quark after 9 years when they hadn't before? Just more data to get 5 sigma? Or new analysis/detectors?
 
  • #21
The first years were spent building up the proper detectors (weird timeline, but here it is); Run 1 started in 1992. After that it was mainly the amount of data collected. The analyses improved over time as well, but more towards the conservative side. Several analysis techniques that are now standard were developed there.
 
  • #22
The NYT article here on the LHC null result of diphoton excess mentions that: "The Higgs, one of the heaviest elementary particles known, weighs about 125 billion electron volts. That, however, is way too light by a factor of trillions according to standard quantum calculations, unless there is some new phenomenon, some new physics, exerting its influence on the universe and keeping the Higgs mass from zooming to cataclysmic scales. That would mean new particles."

Can anyone elaborate on the suppression of the Higgs mass to the observed value?
 
  • #23
Well, the Higgs mass is one of the fundamental parameters of the Standard Model that have to be determined by experiment. In 2012 it was found to be about 125 GeV, and that's why we know this value now. There's no deeper reason for it, i.e., no theory explaining the value from some (symmetry) principle. That would require a new model describing physics beyond the Standard Model. There are many ideas around, but so far no evidence for such physics beyond the Standard Model (except neutrino masses and mixing).
 
  • #24
Ranku said:
The NYT article here on the LHC null result of diphoton excess mentions that: "The Higgs, one of the heaviest elementary particles known, weighs about 125 billion electron volts. That, however, is way too light by a factor of trillions according to standard quantum calculations, unless there is some new phenomenon, some new physics, exerting its influence on the universe and keeping the Higgs mass from zooming to cataclysmic scales. That would mean new particles."

Can anyone elaborate on the suppression of the Higgs mass to the observed value?
There is a fine-tuning element that appears odd: You would expect the Higgs mass to be the sum of its bare mass and corrections from other particles. Those corrections from other particles are naturally at the scale of new physics - at the Planck scale if nothing comes earlier. That means the bare mass has to be of the order of the Planck scale as well, and the sum of the two is then 125 GeV - 17 orders of magnitude below the Planck scale. That would be a remarkable coincidence: two unrelated numbers agreeing with each other to 17 significant figures.
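
To put a rough number on that coincidence, here is a minimal sketch; the Planck mass value below is the standard ~1.2×10^19 GeV, which I am inserting here, not a figure from the post.

[code=python]
# The "17 significant figures" in numbers: if the bare mass and the corrections
# are each of order the Planck scale but their sum is 125 GeV, the two numbers
# have to agree (up to sign) to about one part in 10^17.
m_planck = 1.22e19  # Planck mass in GeV (standard value, my input)
m_higgs = 125.0     # observed Higgs mass in GeV

print(f"required level of agreement: one part in {m_planck / m_higgs:.0e}")  # ~1e17
[/code]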

There are four main alternatives:
- new particles not too much heavier than the Higgs, most notably supersymmetry
- the Higgs is not elementary (e. g. technicolor)
- something like the relaxion, the Higgs mass starts at the Planck scale and gets lower until it reaches the scale of electroweak symmetry breaking
- we misunderstand the problem in some fundamental way
 
  • #25
I was scanning through the ATLAS presentation here
I am sorry if it's a stupid question, but I don't quite see/understand why the following selection [slide 3]:
[itex] \frac{p_{T1}}{m_{\gamma \gamma}} > 0.4 ~~,~~ \frac{p_{T2}}{m_{\gamma \gamma}} > 0.3[/itex]
suppresses the small scattering angles (or large delta theta between the photons). Or, if the goal is to suppress that region, why not cut on the delta theta between the photons directly instead of imposing a pT-over-reconstructed-invariant-mass ratio cut?
 
  • #26
You have many background events where the photons go in the forward directions. There are different approaches to keep those out of the analysis; I guess both experiments tested all of them (on MC).

- a cut on the pseudorapidity difference would probably work
- relative pT cuts, as ATLAS uses them
- remove events where both photons hit the endcap, as CMS does

A potential problem with the first approach: the difference in pseudorapidity goes into the invariant mass calculation, so cutting on it can distort the spectrum.

For the spin-2 analysis, the photons tend to go more into the endcaps, and all the methods described above remove some signal. Therefore, ATLAS uses a different selection for this analysis, with only absolute pT cuts. As far as I understand, CMS doesn't have jet background rejection good enough to consider endcap/endcap events; at least they haven't shown those publicly so far.
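
Here is a minimal kinematic sketch of why the relative-pT cuts do that job. This is my toy, not the ATLAS analysis: it assumes a 750 GeV scalar produced with negligible transverse momentum, decaying isotropically to two photons, in which case pT/mγγ is fixed by the decay angle and the pseudorapidity difference is invariant under longitudinal boosts.

[code=python]
# Toy: a scalar with negligible pT decays isotropically to two photons.
# With zero resonance pT the photons have equal pT, so the tighter cut
# pT/m > 0.4 is the one that matters, and it caps |delta eta|.
import numpy as np

rng = np.random.default_rng(0)
cos_ts = rng.uniform(-1.0, 1.0, 200_000)       # decay angle theta* in the rest frame
sin_ts = np.sqrt(1.0 - cos_ts**2)

pt_over_m = 0.5 * sin_ts                       # pT1/mgg = pT2/mgg for zero resonance pT
delta_eta = 2.0 * np.arctanh(np.abs(cos_ts))   # |eta1 - eta2|, longitudinal-boost invariant

passed = pt_over_m > 0.4
print("max |delta eta| without any cut :", round(float(delta_eta.max()), 1))          # very forward photons
print("max |delta eta| after pT/m > 0.4:", round(float(delta_eta[passed].max()), 2))  # ~1.39, central only
[/code]

The asymmetric thresholds (0.4 and 0.3) presumably exist because a real resonance carries some transverse momentum, which makes the two photon pTs unequal; that effect is beyond this zero-pT sketch.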
 
  • #27
mfb said:
There is a fine-tuning element that appears odd: You would expect the Higgs mass to be the sum of its bare mass and corrections from other particles. Those corrections from other particles are naturally at the scale of new physics - at the Planck scale if nothing comes earlier. That means the bare mass has to be of the order of the Planck scale as well, and the sum of the two is then 125 GeV - 17 orders of magnitude below the Planck scale. That would be a remarkable coincidence: two unrelated numbers agreeing with each other to 17 significant figures.

There are four main alternatives:
- new particles not too much heavier than the Higgs, most notably supersymmetry
- the Higgs is not elementary (e. g. technicolor)
- something like the relaxion, the Higgs mass starts at the Planck scale and gets lower until it reaches the scale of electroweak symmetry breaking
- we misunderstand the problem in some fundamental way

The last of those options is by far the most plausible. Experimental exclusions of supersymmetry grow stronger by the day. There is no experimental evidence at all that points to technicolor-type solutions. And the relaxion similarly has nothing affirmative to support it.
 
  • #28
Thanks for the replies.
 
  • #29
mfb said:
There are four main alternatives:
- new particles not too much heavier than the Higgs, most notably supersymmetry
- the Higgs is not elementary (e. g. technicolor)
- something like the relaxion, the Higgs mass starts at the Planck scale and gets lower until it reaches the scale of electroweak symmetry breaking
- we misunderstand the problem in some fundamental way

Sorry if this is a stupid question, but:
If the Higgs generates mass, then wouldn't its own mass be an operator, not a constant parameter?
 
  • #30
The Higgs field is related to giving particles mass. The Higgs mechanism gives all massive particles some fixed mass, including the Higgs boson.

ATLAS and CMS now have datasets 3-4 times the size of what was shown at ICHEP. We should get new results in the near future, at the latest at Moriond (March).
 
  • #31
mfb said:
We should get new results in the near future, at the latest at Moriond.
I think the aim is at Moriond (not before)?
 
  • #32
Didn't see announcements for anything before that, but that doesn't mean there is no chance things get presented earlier.
 
  • #33
Is there another physics run next year?
 
  • #34
arivero said:
Is there another physics run next year?
Yes... the LHC will undergo a short shutdown during the first part of 2017 and then continue taking data almost the whole time until the end of 2018, when LS2 is scheduled. I think the plan is to reach above 100/fb of data before LS2.
 

1. What is the "diphoton excess" observed in the new LHC results?

The diphoton excess refers to an unexpected excess of events involving the production of two photons (particles of light) at the Large Hadron Collider (LHC) at CERN. This excess was observed in the data collected by the ATLAS and CMS experiments in 2015.

2. What is the significance of the diphoton excess in relation to the Standard Model of particle physics?

The diphoton excess was significant because it did not fit within the predictions of the Standard Model, the current theory that describes the fundamental particles and their interactions. It suggested the possibility of new, previously unknown particles or interactions that could have explained the excess.

3. How was the diphoton excess detected and measured?

The diphoton excess was detected by analyzing the data collected by the ATLAS and CMS experiments at the LHC. The excess was measured by comparing the number of observed events involving the production of two photons to the number of events predicted by the Standard Model.

4. What are some potential explanations for the diphoton excess?

Several theories were proposed to explain the diphoton excess, including the existence of a new particle, such as a heavy Higgs boson or a graviton, or the presence of new interactions between known particles. Further data and analysis were needed to confirm or rule out these explanations.

5. What are the implications of the diphoton excess for future research at the LHC?

The diphoton excess generated a lot of excitement and interest in the scientific community, as it could potentially have led to new discoveries and advancements in our understanding of the fundamental building blocks of the universe. However, the larger dataset collected in 2016 showed no excess at the same mass, indicating that the 2015 bump was a statistical fluctuation.
