Data Collection Begins: Monitoring the Diphoton Excess

Thread starter: mfb
Tags: Data
  • #51
6/fb are very interesting already - significantly more than the 2015 dataset which produced the excess. In May, I speculated a bit, with way too pessimistic estimates for the luminosity evolution (the schedule had less time for data-taking and the collision rate was expected to grow significantly slower).
If it was just a fluctuation, we'll probably know; if it is a particle, we'll probably know as well. The amount of data shown at ICHEP will depend on how fast ATLAS and CMS can do their analyses, but I would expect at least 6/fb, probably more.
 
  • Like
Likes Lord Crc
  • #52
mfb said:
The amount of data shown at ICHEP will depend on how fast ATLAS and CMS can do their analyses, but I would expect at least 6/fb, probably more.

A little bird tells me that the ATLAS analysis is being performed with the first ~3 /fb and will then be "topped up" with all data up to some cutoff date in about 2 weeks' time. By my reckoning it could be about 10 /fb.
 
  • #53
Aww, now it feels like I've put the commentator's curse on the LHC :(

How bad is it? I just saw there was some issue with a power supply and now they're talking about reconnecting the 400 kV line.
 
  • #54
Various issues prevented data-taking in the last four days, apart from a short run this morning (0.09/fb). Power supplies, water leaks, cabling problems, ...
The current estimate for the high voltage line is 20:00 (in 90 minutes). Can happen - it is an extremely complex machine, not all the components work all the time.
 
  • Like
Likes Lord Crc
  • #55
Bloody trees :(
 
  • #56
It is running again. Looks like a new luminosity record this morning, the displayed ATLAS value exceeded 95% of the LHC design luminosity.
0.2/fb collected already.

Edit: Peak luminosity was shown as 97.7% design luminosity for ATLAS (87.4% for CMS). Delivered luminosity was recorded as 0.576/fb and 0.548/fb respectively.

ATLAS now shows 7.08/fb collected data, CMS 6.77/fb. Two more days and we might have twice the 2015 dataset.

They modified the injection scheme from the preaccelerators a bit: instead of 30 injections with 72 bunches each, they now have 23 injections with 96 bunches each. Apparently that's still fine with the SPS vacuum, and it leads to slightly better beams.
 
Last edited:
  • Like
Likes Lord Crc and vanhees71
  • #57
The luminosity delivered to LHCb this year has now surpassed 2015.
 
  • Like
Likes mfb and vanhees71
  • #58
The LHC reached its design luminosity! The ATLAS value is shown as a bit more, the CMS value as a bit less; that is within the uncertainties of those values. The average is slightly above the design value of 10,000 (in units of 10^30 cm^-2 s^-1, i.e. 10^34 cm^-2 s^-1).

[Attached image: lhcdesignlumi.png]
 
  • Like
Likes websterling, JorisL and ProfuselyQuarky
  • #59
The run from Sunday 17:30 to Tuesday 6:30 broke all records.

- initial luminosity: see previous post, first time the design value has been reached
- stored energy: 293 MJ in 5×10^14 protons
- time in stable beams: 37 hours
- delivered luminosity in a run: 0.737/fb for ATLAS, 0.711/fb for CMS, 0.042/fb for LHCb
- delivered luminosity in 24 hours: don't know, but it is a new record as well.
- about 50% of the accelerated protons got destroyed during the run, most of them in the experiments. That's 0.4 nanogram of hydrogen.

Final luminosity was about 30% of the initial value. The machine operators dumped the beam to refill; the next run has already started, with a slightly lower luminosity than the previous record run.

7.7/fb data for ATLAS and CMS so far, twice the 2015 value.
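The 0.4 nanogram figure checks out with back-of-the-envelope arithmetic (a quick sketch; the 50% fraction and the 5×10^14 stored protons are the numbers from the post above):

```python
m_p = 1.6726e-27          # proton mass in kg
n_destroyed = 0.5 * 5e14  # about half of the 5e14 stored protons were burned off
mass_kg = n_destroyed * m_p
mass_ng = mass_kg * 1e12  # kg -> nanograms
print(round(mass_ng, 2))  # -> 0.42, i.e. roughly 0.4 ng of hydrogen
```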
 
  • Like
Likes arivero, Imager, ProfuselyQuarky and 3 others
  • #60
More than 10/fb for ATLAS and CMS, approaching three times the 2015 dataset size.

On this page, they showed an updated plot on the luminosity evolution. I extrapolated wildly again, and we are still on track for >30/fb by November 1st.

Probably a bit too optimistic, as longer machine development and technical stops will come later. On the other hand, if we get lucky, the preaccelerator vacuum problem gets fixed and allows higher luminosities.

Dotted green: the original plan for 2016.

[Attached image: lumievolution.png]
 
  • Like
Likes vanhees71
  • #61
Performance in the last two weeks has been amazing. Monday-to-Monday set a new record with 3.1/fb collected in one week (nearly as much data as in the whole of last year), while the LHC experiments could take data 80% of the time. About 12 weeks of data-taking remain for this year; 13.5/fb has been collected already (more than 3 times the 2015 dataset).

The LHC is now reliably reaching the design luminosity at the start of a run, and most runs last so long that they are dumped by the operators (rather than ended by a technical issue) to start a new run - the number of protons goes down over time, and after about 24 hours it becomes more efficient to dump the rest and accelerate new protons.
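The ~24-hour break-even point can be illustrated with a toy model: assume the luminosity decays exponentially, L(t) = L0·exp(-t/τ), and that each dump-refill-ramp cycle costs a fixed turnaround time; then maximize the average delivered luminosity per cycle. The lifetime and turnaround numbers below are illustrative assumptions, not official LHC figures.

```python
import numpy as np

tau = 35.0         # assumed effective luminosity lifetime in hours (illustrative)
turnaround = 10.0  # assumed hours lost to dump, refill, ramp and squeeze

def avg_lumi_rate(run_hours):
    # integrated luminosity per cycle divided by total cycle time,
    # with the initial luminosity normalized to 1
    delivered = tau * (1.0 - np.exp(-run_hours / tau))
    return delivered / (run_hours + turnaround)

run_hours = np.linspace(1.0, 60.0, 2000)
best = run_hours[np.argmax(avg_lumi_rate(run_hours))]
print(round(best, 1))  # optimum comes out near 24 hours with these numbers
```

With these (made-up) parameters, running much longer than a day yields less total luminosity than dumping and refilling.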

The SPS vacuum issue won't get resolved this year, but there are still some clever ideas how to increase the collision rate a bit more.

[Attached image: lumievolution.png]
 
  • Like
Likes Lord Crc and ProfuselyQuarky
  • #62
mfb said:
The SPS vacuum issue won't get resolved this year, but there are still some clever ideas how to increase the collision rate a bit more.

Slide 14 of the first set of slides from Monday's LPC meeting (http://lpc.web.cern.ch/lpc-minutes/2016-07-11.htm) has a table of the max. possible bunches (total and colliding at each point) given a train length, with and without moving the Abort Gap Keeper.

If the SPS can handle 144 bpi, we could see up to 2532b/beam, which translates to a 24% increase in the number of colliding bunches at ATLAS & CMS and a 38% increase at ALICE and LHCb.
 
  • Like
Likes mfb
  • #63
18.3/fb (or 19.0, 19.2 or 19.4 according to other sources - I'll keep the previous one to have consistent comparisons) collected for ATLAS and CMS (compare this to ~4/fb in the whole of last year). The first machine development block of this year started today; stable beams will probably resume on Monday.

Over the last few days, the LHC implemented a "BCMS" scheme ("Batch Compression, Merging and Splitting") - a different way to prepare the bunches in the preaccelerators. As a result, the beams are focused better, leading to a higher luminosity. We had several runs starting at 120% of the design luminosity. The availability of the machine was great as well, so the LHC experiments could collect a huge amount of data.

I updated the luminosity plot including a paint-based extrapolation, taking into account planned downtimes. The light green line was a very early official estimate, the red line was an earlier extrapolation from me.
If the LHC can continue to run as well as in the last two weeks, it will beat even the optimistic red line extrapolation significantly.

The ICHEP conference starts Wednesday next week; on Friday there are talks about the diphoton mass spectrum where we can learn more about the 750 GeV bump seen last year.
[Attached image: lumievolution.png]
 
  • Like
Likes ProfuselyQuarky and Lord Crc
  • #64
Are any of the parallel sessions recorded and published by any chance?
 
  • #65
I was also thinking about the Higgs boson - any hints that there might be something interesting or unexpected there?
 
  • #66
Lord Crc said:
Are any of the parallel sessions recorded and published by any chance?

I didn't see anything on the conference website 38th International Conference on High Energy Physics about the sessions being recorded or streamed. But it did say that the Proceedings will be publicly available. Also, presented papers tend to show up on the arXiv.
 
  • Like
Likes Lord Crc
  • #67
Typically the slides are made available quickly after the sessions (sometimes before, but then people go through the slides instead of listening to the talks even more), with the corresponding arXiv uploads a bit before or after that.

websterling said:
But it did say that the Proceedings will be publicly available.
Proceedings appear several months later - at a point where all those analyses have already been updated to the full 2016 dataset, and no one cares about proceedings any more.
 
  • Like
Likes Lord Crc
  • #68
Yeah, I think they were recording only the Plenary speakers at the last ICHEP, our parallel sessions were not. But usually even the video takes time to be put on the website. A lot of the slides, especially those with preliminary experimental results, might not be made public either.
 
  • #69
Thanks for the info, guess I've been a bit spoiled by pirsa :)
 
  • #70
Since the Abort Gap Keeper was moved and verified before the MD period, the filling scheme should change at some point this week to one with 2200 bunches per beam (up from 2076).

This will mean a 9% increase in the number of colliding bunches at ATLAS and CMS, 16% at ALICE and 18% at LHCb.
 
  • Like
Likes mfb
  • #71
Machine development is over, but all they got in the last 2.5 days were two short runs. It cannot always work as nicely as it did in the last few weeks.

The initial problem was a communication issue in the cryogenic system; afterwards one magnet did not work properly. This morning the preaccelerators needed some intervention - now fixed, but the magnet still has problems.

Once the magnet is running again, they try to quickly go to a larger number of bunches (see the post by dukwon).

Another thing that will be tested is luminosity leveling - LHCb uses it already, ATLAS and CMS want to use it later: The beams are deliberately collided with some position offset to reduce the interaction rate to something the detectors can reasonably process. Currently ATLAS and CMS are interested in as many collisions as the machine can give them (up to ~40 per bunch crossing), but with the high-luminosity upgrade they will need this leveling procedure to limit the collisions per bunch crossing to about 150 while the machine could achieve something like 250. LHCb is designed for a lower luminosity, they have been running with luminosity leveling since 2011.
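For two round Gaussian beams of equal size, a transverse separation d reduces the head-on luminosity by the factor exp(-d²/4σ²), which is the handle leveling exploits. A minimal sketch (the beam-size value in the example is a placeholder, not a real LHC parameter):

```python
import math

def lumi_factor(d, sigma):
    # luminosity reduction factor for two equal round Gaussian beams
    # separated transversely by d (same units as the beam size sigma)
    return math.exp(-d**2 / (4.0 * sigma**2))

def offset_for_target(frac, sigma):
    # separation needed to level the luminosity to `frac` of the head-on value
    return 2.0 * sigma * math.sqrt(-math.log(frac))

# e.g. halving the interaction rate requires about 1.67 beam sigmas of separation
print(round(offset_for_target(0.5, 1.0), 2))
```

As the beam intensity decays during a fill, the machine can shrink the offset to hold the rate constant, which is how LHCb keeps its pile-up stable.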
 
  • Like
Likes Lord Crc
  • #72
Next fill will be the first to take advantage of the new Abort Gap Keeper position. 2173b @ 96bpi
 
  • Like
Likes Lord Crc and mfb
  • #73
[Attached image: Screenshot from 2016-08-16 15-49-01.png]


From the most recent LPC meeting...
 
  • #74
I saw that ;).
21.6 or 23.7/fb delivered to ATLAS and CMS, depending on which number you trust. Data-taking after MD was slower than before due to various issues, but it is still better than the 2016 projection.

The LHC is now running with 2220 bunches per beam, but at a lower number of protons per bunch. One magnet seems to have an electrical problem inside and could get damaged if there is a quench, so they are very careful about that magnet now. If it gets damaged, a replacement could easily take two months, which basically means the end of data-taking for this year.
 
  • Like
Likes Lord Crc
  • #75
Just a question: if the LHC has a proton collision energy of 13 TeV, what is the maximum energy of particles it can produce? Half (6.5 TeV) only? Or smaller?
 
  • #76
cube137 said:
Just a question: if the LHC has a proton collision energy of 13 TeV, what is the maximum energy of particles it can produce? Half (6.5 TeV) only? Or smaller?

The actual collision energy is less than 13 TeV (it is the quarks or gluons that collide, and they carry a fraction x of the proton's energy/momentum).
It depends on what is produced... in general the collision will produce several particles, and so the energies are not "fixed" by energy-momentum conservation, as they are in the case where you produce just two particles... some can be higher, some lower.
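The available energy in a hard collision is set by the parton momentum fractions: sqrt(s_hat) = sqrt(x1·x2)·sqrt(s). A quick sketch (the momentum fractions in the example are arbitrary illustrative values):

```python
import math

SQRT_S = 13.0  # proton-proton centre-of-mass energy in TeV

def parton_sqrt_s(x1, x2):
    # effective centre-of-mass energy of two colliding partons carrying
    # momentum fractions x1 and x2 of their respective protons
    return math.sqrt(x1 * x2) * SQRT_S

# modest momentum fractions leave far less than the full 13 TeV
print(round(parton_sqrt_s(0.1, 0.05), 2))  # -> 0.92 TeV
```

Only in the (vanishingly rare) limit x1 = x2 = 1 would the full 13 TeV be available to the hard process.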
 
  • #77
ChrisVer said:
The actual collision energy is less than 13 TeV (it is the quarks or gluons that collide, and they carry a fraction x of the proton's energy/momentum).
It depends on what is produced... in general the collision will produce several particles, and so the energies are not "fixed" by energy-momentum conservation, as they are in the case where you produce just two particles... some can be higher, some lower.

Are they high enough to detect KK particles, which may weigh up to 2 GeV, or do we have to wait another 20 years and $10 billion for the China collider?
 
  • #78
A reaction like gluon+gluon -> particle with 10 TeV mass is not impossible (if such a particle exists), but incredibly unlikely as (a) the gluons would have to carry a large fraction of the total energy of both protons and (b) single particle production is always problematic in terms of phase space and conserved quantities.

As far as I know, the searches for black holes (which are not actual particles, but share many of their properties) are the only searches with exclusion limits above 6.5 TeV.
Here is a summary of CMS run 1 exclusion limits, the corresponding ATLAS plots look similar: no exclusion limit beyond 4 TeV as the production rate would be too tiny to see it. Only black holes have a huge production rate at high masses, if they are possible at the LHC energies.

Currently there are some problems with the magnets of ALICE and ATLAS.
 
Last edited:
  • #79
mfb said:
A reaction like gluon+gluon -> particle with 10 TeV mass is not impossible (if such a particle exists), but incredibly unlikely as (a) the gluons would have to carry a large fraction of the total energy of both protons and (b) single particle production is always problematic in terms of phase space and conserved quantities.

As far as I know, the searches for black holes (which are not actual particles, but share many of their properties) are the only searches with exclusion limits above 6.5 TeV.
Here is a summary of CMS run 1 exclusion limits, the corresponding ATLAS plots look similar: no exclusion limit beyond 4 TeV as the production rate would be too tiny to see it. Only black holes have a huge production rate at high masses if they are possible at the LHC energies.

I've been googling and researching the searches for KK particles for the past two hours... what mass range (in TeV) has already been investigated? Are KK particles still possible to be found by the LHC?
 
  • #80
Randall-Sundrum Gravitons would form such a structure. The diphoton peak appeared in a search for those particles. There are also searches for heavier versions of a Z or W, those could be new particles or KK-like heavier states of the Z/W. I guess searches for excited quark states are also similar to this.
 
  • #81
I am not sure if the W' has been used for such a search (because they have to come from models that have an extra SU(2))... Z' searches have been done though (I guess because they can be connected to just a U(1) that comes from string theories and so on).
I may be wrong though...
 
Last edited:
  • Like
Likes vanhees71
  • #82
I would expect the W to have heavier partners as well if the other particles have them, but I'm not sure.
 
  • #83
An E6 gauge group for example (which can appear from string-theory low-energy phenomenology http://www.sciencedirect.com/science/article/pii/0370157389900719 ) can break down into an SU(5) group + two additional U(1) groups http://journals.aps.org/prd/pdf/10.1103/PhysRevD.34.1530 ... (the reason I said that I cannot be sure is because all these can be very model-dependent; for example you can have SU(4)xSU(2)xSU(2), which can bring a W').
Those U(1) can predict heavier Z' gauge bosons, but not W'.
I guess people then adopt the last paper's notation when they search for them and denote them Z_ψ and Z_χ.
 
Last edited by a moderator:
  • #84
The appealing thing with additional U(1) gauge fields is that they can be massive gauge fields without additional Higgs particles in the physical spectrum. In the Abelian case you can just have a naive mass term without violating gauge invariance by introducing an additional scalar field, the Stueckelberg ghost. The point is that in the Abelian case this Stueckelberg ghost decouples completely from the dynamics (as do the Faddeev-Popov ghosts in the Abelian case).

At finite temperature, the ghosts are important since they give the correct counting of the bosonic degrees of freedom: you have four gauge-field degrees of freedom and one Stueckelberg ghost (which is quantized as a true boson, i.e., as a c-number field in the path integral) and two Faddeev-Popov ghosts (which are quantized as pseudo-fermionic fields to provide the determinant of the gauge transformation in the Faddeev-Popov formalism). So together you have 4+1-2=3 physical and true bosonic degrees of freedom, corresponding to the three physical spacelike degrees of freedom of a massive vector field.
 
  • #85
mfb said:
Randall-Sundrum Gravitons would form such a structure. The diphoton peak appeared in a search for those particles. There are also searches for heavier versions of a Z or W, those could be new particles or KK-like heavier states of the Z/W. I guess searches for excited quark states are also similar to this.

Randall-Sundrum RS1 and RS2 warped extra dimensions, including different sizes, have different unique KK particle signatures... I've been looking for the KK particles already excluded by current and past LHC searches... what websites summarize the KK particles already excluded (as well as the corresponding dimensional stuff)? Thanks.
 
  • #86
cube137 said:
Randall-Sundrum RS1 and RS2 warped extra dimensions, including different sizes, have different unique KK particle signatures... I've been looking for the KK particles already excluded by current and past LHC searches... what websites summarize the KK particles already excluded (as well as the corresponding dimensional stuff)? Thanks.
Well, if any such public result exists from either ATLAS or CMS, you can find it in their "Exotics" results:
https://twiki.cern.ch/twiki/bin/view/AtlasPublic/ExoticsPublicResults
https://twiki.cern.ch/twiki/bin/view/CMSPublic/PhysicsResultsEXO
most of the time the papers refer to the "signatures" they search for (for example charged lepton + MET, or dilepton*, or jet+stuff and so on), and maybe use keywords like "extra dimensions".
Also the PDG review on the topic you are interested in (e.g. http://pdg.lbl.gov/2015/reviews/rpp2015-rev-extra-dimensions.pdf) contains information on the searches for those particles, so you could check it out (and the references therein)

*Here I use "leptons" to refer to all SM leptons: eμτ... most of the time, "leptons" in paper titles refers to light leptons (e and μ), as here http://arxiv.org/abs/1407.2410 ; if the search is τ-specific, the taus appear in the title (mainly due to the differences between e/μ and τ signals)
 
Last edited:
  • Like
Likes mfb
  • #87
mfb said:
A reaction like gluon+gluon -> particle with 10 TeV mass is not impossible (if such a particle exists), but incredibly unlikely as (a) the gluons would have to carry a large fraction of the total energy of both protons and (b) single particle production is always problematic in terms of phase space and conserved quantities.

As far as I know, the searches for black holes (which are not actual particles, but share many of their properties) are the only searches with exclusion limits above 6.5 TeV.
Here is a summary of CMS run 1 exclusion limits, the corresponding ATLAS plots look similar: no exclusion limit beyond 4 TeV as the production rate would be too tiny to see it. Only black holes have a huge production rate at high masses, if they are possible at the LHC energies.

Currently there are some problems with the magnets of ALICE and ATLAS.

Would like to verify something. At the bottom of the paper you shared above is the description: "Summary of CMS limits on new physics particle-masses/scales in different BSM searches". What does it mean? Are the bars showing what has already been tested, or the capability of the machine? For example, under "RS Gravitons", the second line RS1(ee,uu), k=0.1 has a bar reaching 2.75 TeV. Is 2.75 TeV the capability of the machine or the energy already tested?
 
  • #88
cube137 said:
Are the bars showing what has already been tested, or the capability of the machine?
Tested and excluded - so if they exist, they have a mass above the value written in the bar.
 
  • #89
ChrisVer said:
tested and excluded

But under Compositeness, there is a bar for dielectrons, A+ LUM, reaching 18.3 TeV. But the LHC has only 14 TeV hadron collision energy... can the components of the debris have more energy than 14 TeV?
 
  • #90
cube137 said:
But under Compositeness, there is a bar for dielectrons, A+ LUM, reaching 18.3 TeV. But the LHC has only 14 TeV hadron collision energy... can the components of the debris have more energy than 14 TeV?
Well, we don't have 14 TeV yet... and I don't know about those high masses... I think it can be possible depending on the actual particle/model... for example, if those particles existed with masses below 18 TeV, they might affect some observable we have at the accessible energies.
What I am sure about is that the "Heavy gauge bosons" show the actual limits observed... well, some don't make sense at all (e.g. in the W'->τν CMS got 3.3 TeV in their latest release, but in the W'->(e/μ)ν they got 4.4 TeV, so I don't understand why their bar for SSM W' is at 3.3 TeV)
 
Last edited:
  • #92
ChrisVer said:
Well, if any such public result exists from either ATLAS or CMS, you can find it in their "Exotics" results:
https://twiki.cern.ch/twiki/bin/view/AtlasPublic/ExoticsPublicResults
https://twiki.cern.ch/twiki/bin/view/CMSPublic/PhysicsResultsEXO
most of the time the papers refer to the "signatures" they search for (for example charged lepton + MET, or dilepton*, or jet+stuff and so on), and maybe use keywords like "extra dimensions".
Also the PDG review on the topic you are interested in (e.g. http://pdg.lbl.gov/2015/reviews/rpp2015-rev-extra-dimensions.pdf) contains information on the searches for those particles, so you could check it out (and the references therein)

*Here I use "leptons" to refer to all SM leptons: eμτ... most of the time, "leptons" in paper titles refers to light leptons (e and μ), as here http://arxiv.org/abs/1407.2410 ; if the search is τ-specific, the taus appear in the title (mainly due to the differences between e/μ and τ signals)

Do you or does anyone have any idea what the maximum scale (in TeV) is before the RS1 and ADD models are totally excluded? Or is there no limit, even reaching up to 50 TeV in future colliders?
 
  • #93
A direct production of real particles is not the only way you can find new physics.

For all searches, more data allows setting better exclusion limits, and the increased energy of run 2 helps massively in nearly all searches.
cube137 said:
Do you or does anyone have any idea what the maximum scale (in TeV) is before the RS1 and ADD models are totally excluded? Or is there no limit, even reaching up to 50 TeV in future colliders?
They could appear anywhere, including millions of TeV. But the nice features of the theory go away if they are not reasonably close to the scale of electroweak symmetry breaking.
 
  • #94
mfb said:
A direct production of real particles is not the only way you can find new physics.

For all searches, more data allows setting better exclusion limits, and the increased energy of run 2 helps massively in nearly all searches.
They could appear anywhere, including millions of TeV. But the nice features of the theory go away if they are not reasonably close to the scale of electroweak symmetry breaking.

What range, for you, is "reasonably close" to the scale of EWSB? Maybe 1 TeV to 20 TeV, or 1 TeV to 70 TeV? Or 1 TeV to 3 TeV?
 
  • #95
There is no fixed limit, higher masses just make the theories less and less plausible. If the LHC doesn't find anything with its full dataset (~2035), then I would expect many theorists to look for new approaches.
 
  • #96
mfb said:
There is no fixed limit, higher masses just make the theories less and less plausible. If the LHC doesn't find anything with its full dataset (~2035), then I would expect many theorists to look for new approaches.

You mean up to the year 2035? That's very long! It's only 2016 now... that's still 19 years to go... or did you mean up to 2035 TeV?
 
  • #97
cube137 said:
year 2035
yup
Well, the LHC was not built to work for just 3-4 years.
 
  • #98
ChrisVer said:
yup
Well LHC was not built to work for 3-4 years.

But just within 1 year of run 2, the LHC has already excluded, say, up to 2.8 TeV for the RS1 warped dimension model... why would it need 19 more years when its limit is only up to 13 TeV hadron collision energy? Or were you saying that they need to look at the data for the next 19 years and all those supersymmetric particles can suddenly become visible, say, 7 years from now? Please clarify. Thank you.
 
  • #99
Here is the current schedule (page 2)

We collected about 4/fb for ATLAS and CMS each in 2015.
This year we should get between 30 and 40/fb; 2017 and 2018 should bring probably another 40 to 50/fb each, for a combined dataset of ~100-150/fb.
Then two years of shutdown for improvements to LHCb, ALICE and various machine components. If it doesn't happen earlier, we can probably go to 14 TeV afterwards.
2021-2023 the experiments hope for more than 50/fb per year, for a total of ~300/fb.
2024-2026 the machine will be upgraded to the High-Luminosity (HL) LHC, pushing the collision rate to about 7 times the current value from 2027 on (with a shorter break in 2031); ATLAS and CMS get major upgrades as well. That should allow collecting about 300/fb per year, to reach about 3000/fb by 2035.

Larger datasets allow increasing the exclusion limits, but also making them harder: you can often tune the signal strength in a model (the exclusion limits are then given for a fixed signal strength), and to find weaker signals you simply need more data.
 
  • Like
Likes vanhees71
  • #100
cube137 said:
why would it need 19 more years when its limit is only up to 13 TeV hadron collision energy?
it's not only the energies that matter, but the amount of data...
With the 2015 dataset at sqrt(s) = 13 TeV, with 3.2/fb of luminosity, the limit on an exotic particle was at ~4.0 TeV.
With the 2016 dataset at sqrt(s) = 13 TeV (same energy), with ~13.3/fb, the limit went to ~4.7 TeV.

Also, read about the top quark's discovery.
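The modest gain (4.0 to 4.7 TeV from roughly 4× the data) follows from the steeply falling production cross-section. In a toy model where σ(m) = σ0·exp(-m/m0) and the limit sits where the expected yield σ·L drops to a few events, the reach grows only logarithmically with luminosity. All parameters below are invented to reproduce the quoted numbers, not taken from any real analysis:

```python
import math

def toy_mass_limit(lumi_fb, sigma0_fb=2800.0, m0=0.5, n_min=3.0):
    # toy: cross-section sigma0*exp(-m/m0) in fb, m in TeV;
    # the limit sits where the expected yield sigma*L falls to n_min events
    return m0 * math.log(sigma0_fb * lumi_fb / n_min)

print(round(toy_mass_limit(3.2), 1))   # -> 4.0 TeV with the 2015-size dataset
print(round(toy_mass_limit(13.3), 1))  # -> 4.7 TeV with the 2016-size dataset
```

Quadrupling the luminosity only adds m0·ln(4) ≈ 0.7 TeV of reach in this toy, which is why collider limits creep up slowly once the steep part of the parton luminosity is reached.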
 
  • Like
Likes vanhees71
