What are the implications of a new 750 GeV particle for GUTs?

In summary: a diphoton excess hints at a possible new 750 GeV particle, and it is unknown whether such a particle would be compatible with currently researched GUTs. If it is confirmed, it would have implications for the Higgs hierarchy problem and for theories like supersymmetry and technicolor, but it is too early to say anything definitive.
  • #1
kodama
What are the implications of a new 750 GeV particle for current GUTs?

In the next couple of months, the LHC will hopefully either confirm or rule out a previously unknown particle at 750 GeV, suggested by a diphoton excess.

Can currently researched GUTs like SU(5), SO(10), etc. accommodate a new particle of spin 0 or spin 2 with a mass of 750 GeV?

And if the LHC does confirm a new 750 GeV particle, how would it affect the Higgs hierarchy problem?

Is there a fine-tuning problem for the Higgs with the scale of new physics occurring at 750 GeV?

How would the 750 GeV particle affect proposed solutions to the Higgs hierarchy problem, like SUSY and technicolor?
 
  • #3
kodama said:
Can currently researched GUTs like SU(5), SO(10), etc. accommodate a new particle of spin 0 or spin 2 with a mass of 750 GeV? [...] How would the 750 GeV particle affect proposed solutions to the Higgs hierarchy problem, like SUSY and technicolor?

A new 750 GeV particle would almost certainly imply multiple other particles to go with it, in a class of particles that does not currently exist. Most GUTs are formulated with the idea that they should predict few particles other than those of the Standard Model, so many won't have an obvious place to fit this, although heroic efforts will be made to do so. Obviously, you can formulate a GUT with as many particles and forces as you can imagine (and this 750 GeV particle would probably imply the need for a gauge force beyond the Standard Model as well), so I'm sure someone could come up with some way to fit it into a GUT scheme, but it would probably be one of the less parsimonious models.

A number of alternative gravity theories feature both a massive and a massless graviton, but generally, the massive graviton in those theories is much lighter and much longer lived.

Unless it contributes to the Higgs field and acquires mass by the Higgs mechanism, it wouldn't impact the Higgs hierarchy problem at all, and we certainly don't know whether this particle acquires mass by that means, as opposed, for example, to being a composite particle of some sort that acquires most of its mass via the binding force that holds it together, the way hadrons do. The fact that the square of the vacuum expectation value of the Higgs field is (almost exactly) equal to the sum of the squares of the Standard Model fundamental particle masses hints that this particle's mass does not derive from the Higgs field or Higgs physics, and its properties have little or nothing in common with a hypothetical heavy Higgs boson from models with two or more Higgs doublets. A heavy Higgs ought to decay differently than this particle apparently does, so it isn't a good fit to the hypothetical A or H particles of SUSY theories with multiple Higgs doublets.
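For reference, this is the numerical coincidence being invoked (an empirical observation, not established physics; approximate pole masses in GeV, with the lighter fermions contributing negligibly):

$$m_t^2 + m_H^2 + m_Z^2 + m_W^2 \approx 173^2 + 125^2 + 91.2^2 + 80.4^2 \approx 60{,}300\ \mathrm{GeV}^2,$$
$$v^2 \approx 246^2 \approx 60{,}500\ \mathrm{GeV}^2,$$

agreeing to better than one percent.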

Neutral electric charge implies the need for charged intermediate particles at some point in the production/decay chain, since photons couple only to electric charge, which makes direct diphoton decays of a neutral particle impossible; yet the pool of possible unstable, electrically neutral spin-0 SUSY particles is not terribly large. If it is produced via gluon fusion, as some theorists suggest, it would also need to couple to the strong force, directly or through colored particles in a loop; sleptons lack color charge, squarks carry electric charge, and SUSY gluinos are fermions, while this particle, if it exists, is a boson. Bottom line: you need a pretty elaborate SUSY model to be a good fit.

At least two preprints address the possibility that this resonance is a techni-pion. http://arxiv.org/abs/1604.02037 and http://arxiv.org/abs/1512.05564
 
  • #4
Thanks for replying.

I've heard that the 750 GeV particle, while it rules out the MSSM, does not rule out the NMSSM.

If the 750 GeV particle is a heavier Higgs, how would that address the fine-tuning problem?

Since they found a Higgs at 126 GeV, is technicolor still viable? Since the Higgs is what breaks electroweak symmetry, is there a well-motivated reason to continue to pursue technicolor?

regards
 
  • #5
It's very much too early to say anything definitive. We don't know enough about the purported particle's properties to really get into deep theoretical issues. Certainly *if* this turns out to be a real physical effect (and not some sort of systematic error), the devil will be in the details: the couplings, which channels it participates in, and so forth.

Clearly, every model with the word "minimal" in it will need to be revised. This isn't the first time new physics has come out of left field and surprised everyone; the discovery of the muon was exactly like that.

Many of the assumptions and arguments going into words like naturalness, GUTs and so forth will need to be rethought in light of what could be an entirely new landscape. I know this isn't satisfying, but unfortunately model building this beast is anything but simple. It really does suggest complicated new physics if it turns out to be true.

Regarding technicolor, the original version of technicolor was ruled out long ago, and most of the remaining parameter space of the *simplest* versions has now been ruled out with the recent discovery of the Higgs.
 
  • #6
ohwilleke said:
A number of alternative gravity theories feature both a massive and a massless graviton, but generally, the massive graviton in those theories is much lighter and much longer lived.
If it is a particle, we don't know its lifetime. The best ATLAS fit corresponds to some nonzero width, but a narrow-width fit works as well, and in the CMS data a narrow width is favored. That gives some minimal lifetime: the resonance cannot be too wide. A maximal lifetime is much more challenging: you cannot use the decay width, and with photons you don't even have precise tracking. It could fly a few millimeters before calorimeter photon pointing would notice it. At nonrelativistic speeds (small pT of the photon pairs), a few millimeters are tens of picoseconds.
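To put rough numbers on this (the standard width-lifetime relation, values rounded):

$$\tau = \frac{\hbar}{\Gamma} \approx \frac{6.6\times 10^{-25}\ \mathrm{GeV\cdot s}}{\Gamma},$$

so the broad fit Γ ≈ 45 GeV corresponds to τ ≈ 1.5×10^-26 s, while 3 mm of flight at ~c takes 3×10^-3 m / (3×10^8 m/s) = 10 ps - many orders of magnitude longer, which is why the measured width says nothing useful about a maximal lifetime.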

Edit:
Haelfix said:
Certainly *if* this turns out to be a real physical effect (and not some sort of systematic error)
A systematic error is the most unlikely result. A statistical fluctuation: possible. But there is no way to get a peak in the diphoton spectrum by doing something wrong.
 
  • #7
What about SUSY? It's my understanding that the MSSM is ruled out if this particle is real.

How soon can the LHC confirm it at the current rate of data collection?
 
  • #8
Optimistic: ~2-3/fb for analyses shown at ICHEP (3rd to 10th of August). That is smaller than the 2015 dataset (~4/fb), but if the excess is real at the 2015 signal strength, both experiments should have something like 2.5-3.5 sigma local significance in the 2016 data alone (rough square-root-of-luminosity scaling; see the sketch after this list), and people will get really excited. Combinations might even reach 5 sigma local significance per experiment, and combining both experiments could give that as a global significance. Such a combination is unlikely, however, as the experiments will focus on analyzing larger datasets.
The full 2016 dataset, currently expected to be 20/fb, will then confirm the existence beyond reasonable doubt and will allow a reasonable cross section measurement.

Pessimistic for LHC performance, but with a particle: 1 to 1.5/fb shown at ICHEP, ~2 sigma if the excess is real and the signal strength is as in 2015. Very interesting, but probably not sufficient to celebrate. A few more weeks should add a few more 1/fb, sufficient to reach high significances.

Optimistic for LHC performance but without a particle: no excess in 2-3/fb; the combination with 2015 still has some significance, but interest will drop rapidly.

Pessimistic: 1 to 1.5/fb, no excess, but the dataset is too small to draw any conclusion. Interest will still drop a bit.
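The scenarios above follow the rule of thumb that, at fixed signal strength, expected significance grows roughly with the square root of the integrated luminosity. A minimal sketch (Python; the ~4/fb and ~3 sigma 2015 baselines are round illustrative numbers, not official figures):

```python
import math

LUMI_2015 = 4.0  # fb^-1, approximate 2015 dataset per experiment
SIG_2015 = 3.0   # sigma, rough local significance per experiment if the excess is real

def expected_significance(lumi):
    """Naive sqrt(luminosity) scaling of expected significance at fixed signal strength."""
    return SIG_2015 * math.sqrt(lumi / LUMI_2015)

# Scenarios from the post: pessimistic ICHEP, optimistic ICHEP, full 2016 dataset.
for lumi in (1.5, 3.0, 20.0):
    print(f"{lumi:4.1f}/fb -> ~{expected_significance(lumi):.1f} sigma per experiment")
```

This neglects analysis improvements and the combination with the 2015 data, but it reproduces the ballpark of the numbers quoted above.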
 
  • #9
Thanks. Are there any specific dates I can mark on my calendar for when CERN will announce analysis results from the new 2016 run that is starting now? Thanks in advance.
 
  • #10
Probably at ICHEP, 3rd to 10th of August. There is no detailed schedule available yet for the conference, but typically they try to move important things to the front so talks in the parallel sessions can show details.

I would expect an update on the diphoton mass spectrum later in the year, and an analysis of the full dataset ~1.5-2 months after data taking ends (~1st of November), probably before Christmas like last year.

All those things are just guesses based on typical timescales of analyses and previous publication schedules.
 
  • #11
How would this impact SUSY? A 750 GeV spin-0 particle is not a superpartner of any known SM particle, and so far gluinos have not shown up.
 
  • #12
mfb said:
A systematic error is the most unlikely result. A statistical fluctuation: possible. But there is no way to get a peak in the diphoton spectrum by doing something wrong.

Yeah, I've seen it argued several ways now, and with all due respect I'm skeptical. The diphoton channel is indeed very clean theoretically, so I don't think it is a misunderstood background attribution error, and the systematic checks from the CMS and ATLAS groups look solid. Nevertheless, statistical fluctuations of this size (even with the trials factor) in both detectors don't happen every day. There was also some tension with Run 1, so I'm not entirely sold on the clean-systematics hypothesis quite yet.
 
  • #13
ohwilleke said:
A new 750 GeV particle would almost certainly imply multiple other particles to go with it, in a class of particles that does not currently exist. [...]

How would this particle imply other particles? If it's a new particle that the Standard Model didn't predict, then how could you possibly know it implies anything at all? Why couldn't it just be one particle on its own?
 
  • #14
What's the probability that this particle isn't a statistical fluke? Are there good odds that this is the real deal?
 
  • #15
Haelfix said:
Yeah, I've seen it argued several ways now, and with all due respect I'm skeptical. [...] Nevertheless, statistical fluctuations of this size (even with the trials factor) in both detectors don't happen every day.
Systematic effects that large, in such a clean channel, would be something completely new, I think. A statistical fluctuation like that can happen. Remember the fluctuation around 140(?) GeV in the Higgs search? It was not as significant, but still interesting, and both ATLAS and CMS had it. It went away with more data. LHCb had a 3.5 sigma deviation in delta A_CP, which is a single measurement value, so there is no local vs. global significance distinction. It went away with more data.
serp777 said:
How would this particle imply other particles? If it's a new particle that the Standard Model didn't predict, then how could you possibly know it implies anything at all? Why couldn't it just be one particle on its own?
If it is a new particle, it is uncharged, but there has to be some interaction that leads to photons - there has to be another particle in a loop. If that loop particle were a Standard Model particle, the 750 GeV particle would be heavy enough to decay directly to a pair of those particles - a decay much more frequent than the diphoton one - and we would have seen that already. We also have various other constraints: it cannot influence the anomalous magnetic moment of electrons or muons too much, for example.
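Much of the theory literature encodes this argument in effective operators: a neutral singlet scalar S can reach photons and gluons only through dimension-five terms, schematically

$$\mathcal{L}_{\rm eff} \supset \frac{c_{\gamma\gamma}}{\Lambda}\, S\, F_{\mu\nu}F^{\mu\nu} + \frac{c_{gg}}{\Lambda}\, S\, G^a_{\mu\nu}G^{a\,\mu\nu},$$

whose coefficients have to be generated by loops of new charged and colored states - the formal version of "there has to be another particle in a loop".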
serp777 said:
What's the probability that this particle isn't a statistical fluke? Are there good odds that this is the real deal?
Depends on who you ask, but I think most particle physicists see the probability for an actual particle somewhere between 1% and 20%.
 
  • #16
mfb said:
If it is a particle, we don't know its lifetime. [...] A systematic error is the most unlikely result. A statistical fluctuation: possible. But there is no way to get a peak in the diphoton spectrum by doing something wrong.

There is certainly uncertainty in the width of a potential 750 GeV resonance. But any proposed massive graviton would have to have a lifetime sufficient to travel, at a minimum, from galaxies to their satellite galaxies at speeds slightly below the speed of light. Certainly, massive gravitons would be expected to have mean lifetimes longer than that of a free neutron (i.e., longer than about 14 minutes, before adjusting for special relativity), which would mean that any of its decays would happen outside the range of the detectors at the LHC. Any particle that decays fast enough for its decays to be detected by a detector at the LHC has a mean lifetime too short to be a massive graviton.
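For the underlying arithmetic: the mean lab-frame flight distance of a particle with energy E, mass m and proper lifetime τ is

$$L = \beta\gamma c\tau, \qquad \gamma = E/m,$$

so anything with a free-neutron-like lifetime (τ ≈ 880 s) travels astronomical distances even at modest boosts, whereas a decay registered within the few-metre scale of an LHC detector requires βγcτ of order metres or less, i.e. τ of order nanoseconds at most.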

Also, I disagree with you on the issue of systematic error. There are an infinite number of ways that systematic error can arise, and some of them could absolutely produce a peak in the diphoton spectrum. To cite just one example, there are probably half a dozen ways that some serious but overlooked screwup in the computer code that takes the raw data from the LHC and converts it into output used by scientists at the LHC could create such a result. Some systematic errors are just way out there - consider the loose electrical connection in the OPERA superluminal neutrino debacle. You probably don't get a peak in the diphoton spectrum from the usual systematic errors, like calibration and imprecision issues, but the fact that you can rule out the "usual suspects" among causes of systematic error doesn't mean that you can rule out the completely unexpected ones. Murphy's Law is a powerful thing and, unlike statistical fluctuations, is much less subject to being quantified in an accurate way.
 
  • #17
kodama said:
What about SUSY? It's my understanding that the MSSM is ruled out if this particle is real.

How soon can the LHC confirm it at the current rate of data collection?

The MSSM has pretty much been ruled out already, with or without this particle. The MSSM continues to be used as a benchmark for much the same reasons that we still use Newtonian gravity for many purposes: even though we know that both are not accurate depictions of Nature, they are accurate enough to use as toy models that simplify reality in respects that often aren't important for the purposes we want to use them for.

Newtonian gravity is adequate for lots of dark matter modeling in many-body problems at the galactic level. The MSSM is sufficient to model the properties that the lightest supersymmetric particles ought to have, if any mainstream less-minimal form of SUSY is real, when setting up computer models to distinguish potential SUSY events from clearly non-SUSY events at a collider.
 
  • #18
serp777 said:
How would this particle imply other particles? If it's a new particle that the Standard Model didn't predict, then how could you possibly know it implies anything at all? Why couldn't it just be one particle on its own?

One of the easier ways to know that this can't be one particle on its own is that it is electrically neutral overall, but it is decaying in a diphoton mode. An electrically neutral fundamental particle, by definition, cannot couple directly to photons (e.g., a neutrino and an antineutrino cannot directly annihilate into photons); electric charge is defined as the propensity to couple to photons. So the particle would have to decay to a charged particle and a charged antiparticle, which in turn could give rise to a diphoton decay path (or it could be a composite particle with charged constituents). But no Standard Model intermediate particles are obvious fits, because we'd see additional decay channels from Standard Model charged particles carrying 375 GeV of mass-energy each. This isn't the only reason that there need to be other particles, but it is the most straightforward to explain.
 
  • #19
mfb said:
Depends on who you ask, but I think most particle physicists see the probability for an actual particle somewhere between 1% and 20%.

Lubos Motl, one of the most "optimistic" physics bloggers out there, is on record saying there is a 50% likelihood that it is real. I call that irrational exuberance and tend to concur with mfb that 1% to 20% is closer to the mark.
 
  • #20
I've not heard that the MSSM is ruled out; is it because of not finding gluinos below the ~1.5 TeV range? So are researchers now moving to the NMSSM?

Just as gluons, W, Z and photons are all spin 1, could you have a spin-2 particle that is not a graviton?
 
  • #21
If it does turn out to be a real particle, then it doesn't fit with the Standard Model, and it doesn't fit with any expectations of supersymmetry either (AFAIK).
Nailing this thing down, even if it turns out to be an unexpected system or software glitch, is certainly going to be interesting.
 
  • #22
rootone said:
If it does turn out to be a real particle, then it doesn't fit with the Standard Model, and it doesn't fit with any expectations of supersymmetry either (AFAIK).
Nailing this thing down, even if it turns out to be an unexpected system or software glitch, is certainly going to be interesting.
There are papers that suggest otherwise:

The 750 GeV Diphoton Excess as a First Light on Supersymmetry Breaking
J.A. Casas, J.R. Espinosa, J.M. Moreno
(Submitted on 24 Dec 2015 (v1), last revised 16 Feb 2016 (this version, v2))
One of the most exciting explanations advanced for the recent diphoton excess found by ATLAS and CMS is in terms of sgoldstino decays: a signal of low-energy supersymmetry-breaking scenarios. The sgoldstino, a scalar, couples directly to gluons and photons, with strength related to gaugino masses, that can be of the right magnitude to explain the excess. However, fitting the suggested resonance width, Gamma ~ 45 GeV, is not so easy. In this paper we explore efficient possibilities to enhance the sgoldstino width, via the decay into two Higgses, two Higgsinos and through mixing between the sgoldstino and the Higgs boson. In addition, we present an alternative and more efficient mechanism to generate a mass splitting between the scalar and pseudoscalar components of the sgoldstino, which has been suggested as an interesting alternative explanation to the apparent width of the resonance.
Comments: 14 pages, 3 figures
Subjects: High Energy Physics - Phenomenology (hep-ph)
Cite as: arXiv:1512.07895 [hep-ph]

A SUSY Inspired Simplified Model for the 750 GeV Diphoton Excess
E. Gabrielli, K. Kannike, B. Mele, M. Raidal, C. Spethmann, H. Veermäe
(Submitted on 18 Dec 2015)
The evidence for a new singlet scalar particle from the 750 GeV diphoton excess, and the absence of any other signal of new physics at the LHC so far, suggest the existence of new coloured scalars. To study this possibility, we propose a supersymmetry inspired simplified model, extending the Standard Model with a singlet scalar and with heavy scalar fields carrying both colour and electric charges -- the `squarks'. To allow the latter to decay, and to generate the dark matter of the Universe, we also add a neutral fermion to the particle content. We show that this model provides a two-parameter fit to the observed diphoton excess consistently with cosmology, while the allowed parameter space is bounded by the consistency of the model. In the context of our simplified model this implies the existence of other supersymmetric particles accessible at the LHC, rendering this scenario falsifiable. If this excess persists, it will imply a paradigm shift in assessing supersymmetry breaking and the role of scalars in low scale physics.
Comments: 7 pages, 2 figures, SUSY incarnate
Subjects: High Energy Physics - Phenomenology (hep-ph)
DOI: 10.1016/j.physletb.2016.02.069
Cite as: arXiv:1512.05961 [hep-ph]

Supersoft SUSY Models and the 750 GeV Diphoton Excess, Beyond Effective Operators
Linda M. Carpenter, Russell Colburn, Jessica Goodman
(Submitted on 18 Dec 2015 (v1), last revised 16 Mar 2016 (this version, v3))
We propose that the sbino, the scalar partner of a Dirac bino, can explain the 750 GeV diphoton excess observed by ATLAS and CMS. We analyze a model in which the sbino couples to pairs of Standard Model (SM) gauge bosons. We analyze an effective operator model, as well as a completion in which the sbino couples to pairs of gauge bosons through loops of heavy sfermions. We find that the sbino may be given an appreciable decay width through tree level coupling in the Higgs or Higgsino sector and additionally fit the 750 GeV excess by considering gluon fusion processes with decay to diphotons.
Comments: 9 pages, 5 figures, References corrected
Subjects: High Energy Physics - Phenomenology (hep-ph)
Cite as: arXiv:1512.06107 [hep-ph]

750 GeV diphotons: implications for supersymmetric unification
Received: 28 December 2015; Revised: 13 February 2016; Accepted: 22 February 2016; First online: 03 March 2016
DOI: 10.1007/JHEP03(2016)017
Abstract: A recent signal of 750 GeV diphotons at the LHC can be explained within the framework of supersymmetric unification by the introduction of vector quarks and leptons with Yukawa couplings to a singlet S that describes the 750 GeV resonance. We study the most general set of theories that allow successful gauge coupling unification, and find that these Yukawa couplings are severely constrained by renormalization group behavior: they are independent of ultraviolet physics and flow to values at the TeV scale that we calculate precisely. As a consequence the vector quarks and leptons must be light; typically in the region of 375 GeV to 700 GeV, and in certain cases up to 1 TeV. The 750 GeV resonance may have a width less than the experimental resolution; alternatively, with the mass splitting between scalar and pseudoscalar components of S arising from one-loop diagrams involving vector fermions, we compute an apparent width of tens of GeV.

These papers have plenty of citations, but I'm wondering how solid they are, and whether the LHC can confirm them. Are these papers reasonable or highly contrived? And is the picture of SUSY here some version of the NMSSM, or something more complicated?
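As a rough illustration of the unification constraint discussed in the last abstract: adding vector-like matter shifts the one-loop beta coefficients, and one can check numerically whether the three gauge couplings still meet. A toy sketch (Python; the Standard Model coefficients and couplings at M_Z are standard one-loop inputs, while the delta_b shift and the 1 TeV threshold are illustrative choices, not taken from the paper):

```python
import math

M_Z = 91.19                        # GeV
ALPHA_INV_MZ = [59.0, 29.6, 8.45]  # approximate 1/alpha_i(M_Z), GUT-normalized U(1)
B_SM = [41 / 10, -19 / 6, -7]      # one-loop Standard Model beta coefficients

def alpha_inv(mu, delta_b=(0.0, 0.0, 0.0), threshold=1000.0):
    """One-loop running of 1/alpha_i up to scale mu (GeV);
    extra matter contributes delta_b only above its mass threshold."""
    result = []
    for a0, b, db in zip(ALPHA_INV_MZ, B_SM, delta_b):
        a = a0 - b / (2 * math.pi) * math.log(mu / M_Z)
        if mu > threshold:
            a -= db / (2 * math.pi) * math.log(mu / threshold)
        result.append(a)
    return result

mu = 2e16  # a putative GUT scale, GeV
print("SM only:             ", [round(x, 1) for x in alpha_inv(mu)])
print("with uniform delta_b:", [round(x, 1) for x in alpha_inv(mu, delta_b=(1, 1, 1))])
```

A complete GUT multiplet shifts all three coefficients equally, which leaves the meeting point of the couplings unchanged while making the unified coupling stronger; that is why such papers can add vector quarks and leptons without spoiling unification.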
 
  • #23
ohwilleke said:
Any particle that decays fast enough for its decays to be detected by a detector at the LHC has a mean lifetime too short to be a massive graviton.
There are various graviton models studied in particle physics, and they all predict lifetimes way below a nanosecond for the heavy versions. I'm not aware of models that predict lifetimes of minutes or more (that could be a bias from particle physics: if they don't decay, they are much harder to find in accelerators). And why should it be able to travel to satellite galaxies?
ohwilleke said:
To cite just one example, there are probably half a dozen ways that some serious but overlooked screwup in the computer code that takes the raw data from the LHC and converts it into output used by scientists at the LHC could create such a result.
In two independent experiments with completely independent code, at the same time? Also, that is easy to check by looking at the event times. And every bug short of someone writing "if (diphoton.mass() > 740 && diphoton.mass() < 760)" doesn't produce a peak.
kodama said:
these papers have plenty of citations but I'm wondering how solid they are, and whether the LHC can confirm them.
Looking for spin and other decay channels (finding them or setting upper limits) would be the most important checks that can be done with 2016 data.
 
  • #24
mfb said:
There are various graviton models studied in particle physics, and they all predict lifetimes way below a nanosecond for the heavy versions. I'm not aware of models that predict lifetimes of minutes or more (that could be a bias from particle physics: if they don't decay, they are much harder to find in accelerators). And why should it be able to travel to satellite galaxies?

Most of the work on massive gravitons is in GR theory, and the main reason they are explored is to provide insight into things like dark energy, dark matter and inflation. The dark energy and dark matter applications require a long-lived massive graviton (although not necessarily even a metastable one that lives, on average, as long as the universe is old), while inflation applications sometimes do and sometimes don't. I haven't seen much work on massive gravitons in particle physics, other than exclusions derived from missing transverse energy.

mfb said:
In two independent experiments with completely independent code, at the same time? Also, that is easy to check by looking at the event times. And every bug short of someone writing "if (diphoton.mass() > 740 && diphoton.mass() < 760)" doesn't produce a peak.

Coding errors are rarely independent. Just as people tend to give the same wrong answers on a multiple-choice mathematics question due to the same kinds of fuzzy thinking, people tend to make the same kinds of errors in coding. I could imagine, for example, a duplicated coding error that throws rounding error into high-energy bins in proportion to the actual effect size, taking a pair of slightly unlikely 750-GeV-ish events and amplifying them. Again, that particular example is probably not it, nor are any of a dozen others I could come up with. But coding errors aren't as independent as you would think, and they could produce this kind of result. I could also imagine, for example, transpositions in some reference table in favor of some number that would seem significant to a coder, or a shared source for a data table (e.g., an error in the same PDF input or in a Particle Data Group branching fraction).

mfb said:
Looking for spin and other decay channels (finding them or setting upper limits) would be the most important checks that can be done with 2016 data.

Certainly good places to look.
 
  • #25
ohwilleke said:
Most of the work on massive gravitons is in GR theory, and the main reason they are explored is to provide insight into things like dark energy, dark matter and inflation. The dark energy and dark matter applications require a long-lived massive graviton (although not necessarily even a metastable one that lives, on average, as long as the universe is old), while inflation applications sometimes do and sometimes don't. I haven't seen much work on massive gravitons in particle physics, other than exclusions derived from missing transverse energy.
The main model for the diphoton mass peak searches is a short-lived RS graviton. Short-lived as in O(GeV) decay width.
ohwilleke said:
Coding errors are rarely independent. Just as people tend to give the same wrong answers on a multiple-choice mathematics question due to the same kinds of fuzzy thinking, people tend to make the same kinds of errors in coding. I could imagine, for example, a duplicated coding error that throws rounding error into high-energy bins in proportion to the actual effect size, taking a pair of slightly unlikely 750-GeV-ish events and amplifying them.
I don't understand that sentence, but the analyses are unbinned, so arguments based on bins don't work.
The same rounding errors in two different collaborations? The collaborations can't even manage to use either MeV or GeV consistently; there is a mix of both within each collaboration.
ohwilleke said:
I could also imagine, for example, transpositions in some reference table in favor of some number that would seem significant to a coder, or a shared source for a data table (e.g., an error in the same PDF input or in a Particle Data Group branching fraction).
There is no such value that would enter the analysis.

Could you please check the analyses before you make up claims that have no connection to them?
 
  • #26
I suppose you could get a situation where a commonly used library of high-precision math routines, which we thought to be infallible, contains a bug that occurs only rarely and is triggered by a specific parameter value in the input data.
 
  • #27
I'm interested in an explanation of how this would lead to more events around 750 GeV. There is no common specific parameter value in the input data; all the events are different.

The individual photon parameters, and the calculation of the invariant mass from them, have been checked, of course. More than once, I would guess. But even if you got that wrong, I don't see how it would lead to a peak.
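For what it's worth, the calculation in question is elementary: for two (approximately massless) photons the invariant mass follows directly from the measured energies and directions. A minimal sketch (Python; eta/phi are the standard collider coordinates, the example numbers are invented):

```python
import math

def diphoton_mass(e1, eta1, phi1, e2, eta2, phi2):
    """Invariant mass of two massless photons: m^2 = 2*E1*E2*(1 - cos(opening angle))."""
    def unit_vector(eta, phi):
        theta = 2 * math.atan(math.exp(-eta))  # pseudorapidity -> polar angle
        return (math.sin(theta) * math.cos(phi),
                math.sin(theta) * math.sin(phi),
                math.cos(theta))
    u1, u2 = unit_vector(eta1, phi1), unit_vector(eta2, phi2)
    cos_opening = sum(a * b for a, b in zip(u1, u2))
    return math.sqrt(2 * e1 * e2 * (1 - cos_opening))

# Two exactly back-to-back 375 GeV photons reconstruct to 750 GeV:
print(diphoton_mass(375.0, 0.5, 0.0, 375.0, -0.5, math.pi))
```

There is no shared per-event parameter in this computation that a bug could latch onto to pile unrelated events up at one mass value.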
 
  • #28
Ah, so the peak is evident in the plain raw data before any processing has been applied?
 
  • #29
Plain raw data is a set of 0s and 1s.

The peak is visible in the dataset with selection criteria that do not depend on the diphoton mass, but only on the individual photons.
 
  • #30
mfb said:
Could you please check the analyses before you make up claims that have no connection to them?

The devil is always in the details. I don't claim to know the details at a level sufficient to identify the particular flaw that is actually the cause of systematic error, if that is the cause. I have far more familiarity, however, with examples of deeply screwed-up conclusions that sounded plausible.

For example, one of the most damaging economics papers in the history of the world, which was used to make policies that substantially contributed to the financial crisis, emerged because of an almost invisible error in an Excel spreadsheet used to reach the paper's conclusions; had the correct formula been used, the paper would have reached the opposite policy recommendation.

Coding errors are a bit like brain damage. Tiny flaws can produce mysterious, totally weird consequences that are almost impossible to predict in advance, and challenging to work out after the fact, because the underlying system is so complex that nobody really understands it completely. This profile of what tiny coding errors can look like corresponds pretty closely to the kind of anomaly we see with the 750 GeV bump, if it is due to systematic error and is not real.

Moreover, LHC investigators are very good at analyzing statistical issues in an experiment. They do it every single time. They have a protocol and a process for doing it. They have a lot of highly analogous computations to draw upon. So, the amount of statistical error is very likely to be correct.

But systematic error estimation is far more ad hoc, the process is far looser, and the analogous computations in prior studies aren't as perfectly analogous.

Statistical error estimates only involve known unknowns, but systematic error estimation always involves at least some unknown unknowns, which are estimated in a fairly ad hoc manner. So, when statistical error probabilities start getting low, and the results you are getting are not well motivated theoretically, a focus on potential sources of systematic error is a natural and sensible choice. But an ability to make an educated guess about the kind of error involved doesn't imply a corresponding ability to know specifically what that error is or how it got there - one of those things that is completely mysterious until you know the answer, after which it becomes totally obvious (see the superluminal neutrinos at OPERA). I'll bet that almost none of even the most savvy armchair physics analysts predicted that it was due to loose cable connections. The stuff that experts aren't thinking about first, and that gets little or no analysis, is usually the cause of otherwise totally inexplicable anomalous results.
 
  • #31
ohwilleke said:
This profile of what tiny coding errors can look like corresponds pretty closely to the kind of anomaly we see with the 750 GeV bump, if it is due to systematic error and is not real.
I don't see any evidence backing up that claim. Any example in the history of particle physics? To have a realistic chance that two of them appear independently in the same way, history would have to be full of those things. I don't think I ever saw that.
ohwilleke said:
Moreover, LHC investigators are very good at analyzing statistical issues in an experiment. They do it every single time. They have a protocol and a process for doing it. They have a lot of highly analogous computations to draw upon. So, the amount of statistical error is very likely to be correct.
Most of the work goes into systematic uncertainties. Particle physicists are very good at analyzing systematic uncertainties. They do it all the time.
ohwilleke said:
But an ability to make an educated guess about the kind of error involved
Then "maybe a binning effect" should not be a guess in an unbinned analysis, for example.
ohwilleke said:
I'll bet that almost none of even the most savvy armchair physics analysts predicted that it was due to loose cable connections.
Correct, but a large fraction (probably the majority) expected some time-measurement issue. It was immediately obvious that timing was one of the three critical points, together with the length measurement and the profile fit. An issue with the profile fit was ruled out later by the shorter bunches. There is a huge difference between "your clock synchronization could have an issue somewhere" (which happens all the time, although not always via loose cables) and "let's claim a coding error could produce a peak in the diphoton spectrum in some unspecified way, without any example or description of how".
 
  • #32
kodama said:
What are the implications of a new 750 GeV particle for current GUTs?
I am surprised I have not seen a dark matter candidate mentioned here yet.
This should not break the Standard Model!
There is a lot more dark matter than normal matter.
If dark matter is made of particles, then either there are many more dark matter particles than normal particles, or those particles are at a much larger mass scale.
This could explain why none of them has been detected yet.
Let's imagine that this new particle, if it is one, is at the low end of the dark matter particle mass range.
Are current accelerators tuned for super-heavy particles?
If not, then we might be missing the correct scale for them.
Is there any reason why dark matter particles could not be as heavy as, or heavier than, heavy atoms?
The reason we have not seen dark matter particles yet could simply be that we are not looking where they are. They might not be at the same scale as normal matter.

Note: In no way should this be considered a personal theory; I am just asking questions here. This particle, if it is proven, looks surprisingly heavy. Since the Standard Model seems complete, it might be in a different field or scale. Before this particle showed up, I had no idea what dark matter could be made of. I might be out in left field here. Did I miss something, or did we miss something? That is my question.
 
  • #33
zdroide said:
I am surprised I have not seen a dark matter candidate mentioned here yet.
This should not break the Standard Model!
There is no undiscovered particle in the Standard Model; a 750 GeV particle would certainly be beyond the Standard Model. And it is not a dark matter candidate: if it exists at all, it is very short-lived.
zdroide said:
Are current accelerators tuned for super-heavy particles?
The LHC runs at the highest energy its magnets can handle, currently 13 TeV collision energy - a factor of ~6 more than the Tevatron, the second-highest-energy collider, could reach.
zdroide said:
Is there any reason why dark matter particles could not be as heavy as, or heavier than, heavy atoms?
Cosmology gives an approximate relation between the current dark matter density and the dark matter particle mass: something like 100 GeV is the most likely; lighter is possible, while much heavier runs into problems.

It is clear that undiscovered particles either (a) are very heavy or (b) couple very weakly to the known particles; otherwise we would have discovered them already. Both options are explored: (a) mainly with high-energy colliders like the LHC, and (b) mainly with dedicated precision experiments.
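The cosmological relation referred to above is the standard thermal-relic estimate: the observed dark matter abundance essentially fixes the annihilation cross section,

$$\Omega_{\rm DM} h^2 \approx \frac{3\times 10^{-27}\ \mathrm{cm^3\,s^{-1}}}{\langle\sigma v\rangle},$$

and for weak-scale couplings a cross section of that size comes out naturally for masses around 100 GeV (the "WIMP miracle"), while for much heavier particles the achievable annihilation cross section becomes too small and the predicted relic density overshoots the observed one.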
 
  • #34
mfb said:
Cosmology gives an approximate relation between the current dark matter density and the dark matter particle mass: something like 100 GeV is the most likely; lighter is possible, while much heavier runs into problems.
What else would dark matter need?
Would it be in particle form?
Some type of primordial particle?
In that case I would think of lots of stable particles of the same type.
Why have we not found any?
Do you have any idea?
Going back to this particle: is it on the SUSY side or the multiverse side?
Thank you.
 
  • #35
zdroide said:
Would it be in particle form?
It's hard to imagine scenarios where there are absolutely no associated particles. In quantum field theory, particles and fields are not different things; you can't have one without the other.
zdroide said:
Some type of primordial particle?
That is the most likely case.
zdroide said:
Why have we not found any?
They don't interact via the electromagnetic or strong interaction, which makes them hard to find.
zdroide said:
Going back to this particle: is it on the SUSY side or the multiverse side?
That question does not make sense. There is no "SUSY side" and no "multiverse side".
 

What is a 750 GeV particle?

A 750 GeV particle is a hypothetical particle with a mass of 750 gigaelectronvolts (GeV). It has not been observed or confirmed by experiments, but its existence has been suggested by data from the Large Hadron Collider (LHC) at CERN.

How does the existence of a 750 GeV particle impact Grand Unified Theories (GUTs)?

The existence of a 750 GeV particle could have significant implications for Grand Unified Theories, which aim to unify the three fundamental forces of nature (strong, weak, and electromagnetic) into one unified force. Some GUTs predict the existence of new particles, and the discovery of a 750 GeV particle could provide evidence for these theories.

What are the potential implications of a 750 GeV particle for the Standard Model of particle physics?

The Standard Model of particle physics is the current framework for understanding the fundamental particles and forces of nature. The discovery of a 750 GeV particle could challenge the predictions of the Standard Model and potentially lead to the development of a new model that better explains the observed data.

What experiments are being conducted to confirm or refute the existence of a 750 GeV particle?

Scientists are currently conducting experiments at the LHC and other particle accelerators to search for evidence of a 750 GeV particle. These experiments involve colliding particles at high energies and analyzing the resulting data for any signs of new particles or interactions that could confirm its existence.

How could the discovery of a 750 GeV particle impact our understanding of the universe?

If a 750 GeV particle is confirmed to exist, it could provide new insights into the fundamental building blocks of the universe and how they interact. This could lead to a deeper understanding of the laws of nature and potentially open up new avenues for scientific research and technological advancement.
