Data Collection Begins: Monitoring the Diphoton Excess

Thread starter: mfb
In summary: at lower beam energies, stray electrons come out of the beam pipe more often and are more energetic, and when they hit the pipe they can knock loose more electrons and gas from its surface, which is not something you want happening too often. So the machine goes through "scrubbing": the ring is filled with a large number of bunches, and the particles knocked out of the beam pipe surface are gradually removed by the bombardment. It takes about 3 to 4 days, but it is worth it because it keeps the beam pipe clean.
  • #106
cube137 said:
If we have a poll: what percentage of physicists here on Physics Forums agree with the above, and what percentage agree that supersymmetry and other major findings can still be found at 100 TeV, like Lubos, who is a string theorist forever?

Sure, it can be found. But it will become harder to convince funding agencies that "the next bump in energy will show experimental results for sure".
I've only worked closely with my advisor, and the way I read him is as a pragmatic physicist. In fact, quite a bit of his work is devoted to excluding possible avenues, including the celebrated KKLT result.

Rule number one: confirming your theory is nice, but something absolutely unexpected is even more fun!

Re Lubos: I think he gives some nice insights in some articles, but he's too harsh, same as Woit (IMHO).
Discussing articles on a blog without too many technical details calls for a lot of nuance.
 
  • #107
We are in another block of machine development / technical stop / special runs at low luminosity now. ATLAS and CMS have collected nearly 30/fb, and four weeks of data-taking are left. The schedule got shifted a bit - data-taking now ends a few days earlier (October 25th instead of November 1st) - but with the recent performance we can still expect about 40/fb in total, much more than the original expectation of 25/fb, and more than three times the amount of data analyzed for ICHEP in August. I expect that we will get some results from the full 2016 dataset around the end of the year.

https://espace.cern.ch/be-dep/BEDepartmentalDocuments/BE/LHC_Schedule_2016.pdf

Luminosity evolution (green: actual data; dotted green: the earlier expectation; red: an early extrapolation):

[Attached image: lumievolution.png]
 
  • #108
Some improvements in the procedure allowed even higher luminosities in the past weeks: peaks up to 150% of the design luminosity, and reliably above 130% at the start of runs. The values from ATLAS and CMS diverge again; this time CMS shows notably higher values, and it is unclear whether they actually have more collisions. Various planned fixes and upgrades should allow even higher luminosities next year.

Last week included a "high pile-up test": as many proton-proton collisions per bunch crossing as possible. They reached 90-95, while the design value is about 25 and the current regular runs have a maximum of about 35-40. The high values per bunch crossing came at the price of having just a few bunches with this performance - not suitable for current operation, where 2200 bunches with 40 collisions each are much better than 50 bunches with 90 collisions (a rough rate comparison is sketched below). The test gives the experiments a better idea of what the next years might look like. The HL-LHC upgrade in ~2023-2026 will then lead to 140-200 collisions per bunch crossing.

A bit more than one week is left for proton-proton collisions, then some machine development; in mid-November, proton-lead collisions will start (3 weeks). Those collisions are an important control sample for understanding the lead-lead collisions better: do lead-lead collisions look like a collection of 208 separate nucleon-lead collisions, or do they show something new? We had a similar run in 2013 already, but at lower energies.
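For a rough sense of scale, here is a back-of-the-envelope comparison of the two filling schemes mentioned above. The revolution frequency (~11245 turns per second) and the inelastic cross section (~80 mb) are my own assumed inputs, so take the absolute numbers as order-of-magnitude only:

```python
# Rough comparison of the two filling schemes mentioned above.
# Assumed inputs (mine, not from the post): LHC revolution frequency ~11245 Hz,
# inelastic pp cross section ~80 mb = 8e-26 cm^2 at 13 TeV.
F_REV = 11245          # revolutions per second
SIGMA_INEL = 8e-26     # cm^2

def collision_rate(n_bunches, pileup):
    """Inelastic pp collisions per second for a given filling scheme."""
    return n_bunches * pileup * F_REV

def luminosity(n_bunches, pileup):
    """Instantaneous luminosity implied by that rate, in cm^-2 s^-1."""
    return collision_rate(n_bunches, pileup) / SIGMA_INEL

for label, n_b, mu in [("regular 2016 running", 2200, 40),
                       ("high pile-up test", 50, 90)]:
    print(f"{label}: {collision_rate(n_b, mu):.1e} collisions/s, "
          f"L ~ {luminosity(n_b, mu):.1e} cm^-2 s^-1")

# regular 2016 running: ~9.9e8 collisions/s, L ~ 1.2e34 cm^-2 s^-1 (above the design 1e34)
# high pile-up test:    ~5.1e7 collisions/s, roughly 20 times less
```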

The option to study those proton-lead collisions is a lucky side-product of the design: both beams see the same magnetic field strength in the bending magnets, which means the two particle types must have the same momentum per charge. Protons have a mass of 1 u per electric charge, while lead has 208 u and 82 charges, a ratio of ~2.53. More mass per momentum means the lead ions are slower: if a bunch of protons collides with a bunch of lead ions at a collision point and then goes around the ring once, the bunch of lead ions is not back there yet, and the collision position would shift all the time. Oops.
Two features make the collisions possible. First, the LHC has much more energy than any previous collider; more energy means all speeds are extremely close to the speed of light, so the speed differences are smaller. The second feature is the decision to have proton-proton collisions (instead of proton-antiproton): the two beams need their magnetic fields in opposite directions, which means they need separate beam pipes. This allows the beams to be steered separately - the lead ions can get the "inside curve", an orbit just a millimeter shorter over the circumference of 27 km, sufficient to keep them synchronized with the protons. At the injection energy, the necessary difference would be 40 cm - way too large to fix this way. The LHC has to accelerate protons and lead to full energy before it can collide them.

Luminosity evolution. The red line is my extrapolation from July 6th. CMS quotes 37.5/fb, ATLAS 34.6/fb; I plotted both. A huge dataset - I'm looking forward to the first results at the end of the year!

[Attached image: lumievolution.png]
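To put rough numbers on the "inside curve" argument above, here is a small sketch of the speed mismatch between protons and lead ions when both beams carry the same momentum per charge. The particle masses and the momenta per charge (450 GeV at injection, 6500 GeV at collision energy) are my own assumed inputs; the result reproduces the scale quoted above - millimetres per turn at full energy, a few tens of centimetres at injection:

```python
import math

C_LHC = 26659.0     # LHC circumference in metres
M_PROTON = 0.9383   # proton mass in GeV
M_PB208 = 193.7     # approximate mass of a fully stripped Pb-208 ion in GeV
Z_PB = 82           # charge of the lead ion

def one_minus_beta(p, m):
    """1 - v/c for a particle with momentum p and mass m (GeV units)."""
    e = math.sqrt(p * p + m * m)
    return (e - p) / e

def path_difference(p_per_charge):
    """Extra distance per turn (in metres) by which the protons outrun the
    lead ions when both beams carry the same momentum per unit charge."""
    lag_proton = one_minus_beta(p_per_charge, M_PROTON)
    lag_lead = one_minus_beta(Z_PB * p_per_charge, M_PB208)
    return (lag_lead - lag_proton) * C_LHC

print(f"injection, 450 GeV per charge:  {path_difference(450.0) * 100:.0f} cm per turn")
print(f"collision, 6500 GeV per charge: {path_difference(6500.0) * 1000:.1f} mm per turn")
# Roughly 30 cm per turn at injection vs ~1.5 mm at collision energy: only at
# full energy can the slightly shorter "inside curve" keep the two beams in step.
```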
 
  • #109
The last run got dumped half an hour ago. No more proton-proton collisions this year. CMS quotes 41.5/fb, ATLAS 38.5/fb, LHCb 1.9/fb. For nearly every analysis, and for all three experiments*, the data collected since ICHEP exceeds the dataset collected over all the years before.

We'll probably see results during Moriond in March, maybe earlier if something really exciting comes up.

* For ALICE, the proton-proton collisions are mainly a control sample; they care more about the lead collisions.

Next: two weeks of technical work, then three weeks of proton-lead collisions, followed by the end-of-year shutdown, with proton-proton collisions to resume in May 2017.
 
  • #110
When's the next AA run BTW?
 
  • #111
Probably next year, but I don't think it's been scheduled.
 
  • #112
Currently there is no heavy-ion run planned for 2017; we should get another lead-lead run in 2018.
Long-term schedule.
 
  • #113
The LHC is running luminosity scans now. This doesn't add to the dataset, but it does improve the precision of some measurements.
 
  • #114
Why no lead-lead in 2016 and 2017? Not expecting many exciting results from it beyond the 2015 data?
 
  • #115
Proton-lead collisions are important as well. Both at the same time doesn't work, and switching between different modes takes some days, so it is more efficient to run just proton-lead (pPb) or lead-lead (PbPb), but not both, within a year. I don't know why there is no lead-lead run planned for 2017. One argument could be the longer-than-usual shutdown this winter: proton-proton collisions would get less time if there were an additional heavy-ion run at the end.
 
  • #116
I guess there is some truth in the idea of #114. The previous pPb results are much more puzzling than the PbPb results obtained so far. In particular, the question of why pPb shows so many "collective phenomena" that are quite well described by hydrodynamics, despite the large gradients involved, is quite exciting. So indeed, the pA runs are at least as exciting as new AA runs might be. Also, for some PbPb results you need not only the pp baseline but also the pPb baseline (i.e., the "cold-nuclear-matter" initial-state effects versus the full hot-medium effects, including the QGP, of the PbPb collision). For example, to learn about heavy-quarkonium suppression vs. regeneration you need to know the initial-state effects such as Cronin enhancement, shadowing, etc. from the pPb data.
 
  • #117
The current schedule has more PbPb than pPb in run 2. We had PbPb last year, now we have pPb - nothing unexpected so far. They could collide PbPb again in 2017 and pPb in 2018, but the current schedule does not have more pPb collisions in run 2.
 
  • #118
mfb said:
I don't know why there is no lead-lead run planned for 2017. One argument could be the longer than usual shutdown this winter, proton-proton collisions would have a shorter time if there would be an additional heavy ion run at the end.

There's that, and I'm pretty sure Linac3 is switching to xenon for that year.
 
  • #119
And not switching back? That would be another reason. On the other hand, collisions with other types of ions were considered for the LHC as well.
 
  • #120
Potentially silly question: with the p-Pb run under way I'm looking at Vistars and wondering why the instantaneous luminosity of ALICE has such great fluctuations compared to the other detectors.

Is it just due to each collision having a much wider range of results depending on "how well" each proton hits the nucleus? I'm thinking bowling here.
 

Attachments: tmp_29022-Screenshot_2016-11-14-11-50-38-1754626351.png
  • #121
The luminosity just depends on beam parameters, not on details of the collisions (which happen in the kHz range anyway, not on the timescale of those fluctuations). I don't know where the fluctuations come from - could be some calibration issue with the measurement, or very frequent changes of the beam overlap by the machine operators.
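For reference, the "beam parameters" in question are essentially the bunch intensities, the number of colliding bunch pairs, the revolution frequency and the transverse beam sizes at the interaction point. For head-on collisions of Gaussian bunches, the textbook expression is (a sketch; crossing-angle and hourglass reduction factors are ignored):

```latex
% Instantaneous luminosity for head-on collisions of Gaussian bunches,
% neglecting crossing-angle and hourglass reduction factors:
L = \frac{f_{\mathrm{rev}} \, n_b \, N_1 N_2}{4\pi \, \sigma_x \sigma_y}
% f_rev: revolution frequency, n_b: number of colliding bunch pairs,
% N_1, N_2: particles per bunch in each beam,
% sigma_x, sigma_y: transverse beam sizes at the interaction point.
```

Separating the two beams slightly at one interaction point reduces their overlap, and therefore the luminosity delivered there, without touching the bunches themselves - which is the handle the operators use when they deliberately adjust or level the luminosity of one experiment.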
The LHC registered the earthquake in New Zealand. It led to a small deformation of the ring, which changes the beam energy a tiny bit. This is the result. The long-term sine modulation comes from the tides; they are quite strong because we are close to a full moon.
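To put a rough number on the tide effect (my own sketch, using the standard relation from storage-ring energy calibration): the RF system fixes the length of the orbit the beam has to follow, so when the tides stretch the tunnel circumference by ΔC, the beam sits slightly off-centre in the quadrupoles and its momentum shifts by approximately

```latex
% Relative momentum shift for a circumference change \Delta C at fixed RF
% frequency, with momentum compaction factor \alpha_c (of order 3 x 10^{-4}):
\frac{\Delta p}{p} \simeq -\frac{1}{\alpha_c} \frac{\Delta C}{C}
```

With a tidal ΔC of the order of a millimetre on C ≈ 26.7 km, this gives Δp/p of order 10⁻⁴, i.e. a fraction of a GeV at 6.5 TeV - tiny, but visible in the machine's monitoring. The same effect was famously observed at LEP.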
 
  • #123
Lord Crc said:
Potentially silly question: with the p-Pb run under way I'm looking at Vistars and wondering why the instantaneous luminosity of ALICE has such great fluctuations compared to the other detectors.

Is it just due to each collision having a much wider range of results depending on "how well" each proton hits the nucleus? I'm thinking bowling here.

This is what luminosity levelling looks like when you zoom in on the y-axis scale.

Here's an example from when they tried levelling ATLAS and CMS:

[Attached image: fqiFbOZ.png]
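For anyone unfamiliar with the term: luminosity levelling means deliberately delivering less luminosity than the machine could, typically by slightly offsetting the two beams at that interaction point, and then reducing the offset in small steps as the beams burn off. That produces the small, frequent steps you see once the y-axis is zoomed in. Below is a toy simulation of that behaviour; all numbers (target, tolerance, luminosity lifetime) are invented for illustration and have nothing to do with the actual machine settings:

```python
import math

TARGET = 1.00        # desired levelled luminosity (arbitrary units)
TOLERANCE = 0.01     # re-adjust when the delivered value sags 1% below target
DECAY_TIME = 10.0    # assumed luminosity lifetime in hours (made-up number)

def toy_levelling(hours=6.0, dt_minutes=1.0):
    """Crude levelling simulation: the head-on luminosity decays exponentially,
    while a transverse offset holds the delivered luminosity near TARGET and is
    reduced in discrete steps whenever the delivered value drifts too low."""
    points = []
    head_on = 1.5                    # more luminosity than needed is available at first
    reduction = TARGET / head_on     # offset chosen so we start right at the target
    t = 0.0
    while t < hours:
        head_on *= math.exp(-(dt_minutes / 60.0) / DECAY_TIME)
        delivered = head_on * reduction
        if delivered < TARGET - TOLERANCE and reduction < 1.0:
            # operators reduce the beam separation a bit -> small upward step
            reduction = min(1.0, TARGET / head_on)
            delivered = head_on * reduction
        points.append((t, delivered))
        t += dt_minutes / 60.0
    return points

for t, lumi in toy_levelling()[::60]:   # print one value per simulated hour
    print(f"t = {t:4.1f} h   delivered L = {lumi:.3f}")
# The delivered luminosity stays within ~1% of the target (small repeated steps)
# until the beams can no longer reach it head-on, after which it simply decays.
```

An experiment that is being levelled shows these small, frequent adjustments, while an experiment taking the full head-on luminosity just decays smoothly.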
 
  • #124
mfb said:
The luminosity just depends on beam parameters, not on details of the collisions (which happen in the kHz range anyway, not on the timescale of those fluctuations). I don't know where the fluctuations come from - could be some calibration issue with the measurement, or very frequent changes of the beam overlap by the machine operators.

As the song goes, I should have known better... :)

mfb said:
The LHC registered the earthquake in New Zealand. It led to a small deformation of the ring, which changes the beam energy a tiny bit. This is the result. The long-term sine modulation comes from the tides; they are quite strong because we are close to a full moon.

Really interesting, thanks for sharing.
 

1. What is the diphoton excess and why is it significant?

The diphoton excess is a statistical anomaly observed in Large Hadron Collider (LHC) data: an unexpectedly high number of events in which two photons are produced. This could potentially be a sign of new physics beyond the Standard Model. It is significant because it could lead to a better understanding of the fundamental building blocks of our universe.

2. How is data collected in the LHC?

Data is collected in the LHC through the use of detectors, which are specialized instruments that measure the properties of particles produced in collisions. These detectors are made up of different layers that each perform a specific function, such as tracking the paths of particles or measuring their energy and momentum.

3. What is the role of monitoring in data collection?

Monitoring is crucial in data collection as it allows scientists to continuously check the performance of the detectors and ensure that the data being collected is of high quality. This helps to identify any issues or anomalies that may arise, such as background noise or malfunctioning equipment, and allows for adjustments to be made in real-time.

4. How is the diphoton excess being studied?

The diphoton excess is being studied through the analysis of the data collected by the LHC detectors. Scientists are looking for patterns and trends in the data that could indicate the presence of new particles or interactions. This involves using statistical methods and comparing the data to theoretical predictions.
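As a concrete illustration of that comparison, here is a minimal sketch of how the local significance of a counting excess in a diphoton mass window could be quoted. The event counts are invented for the example and are not actual ATLAS or CMS numbers, and a real analysis also includes systematic uncertainties and the look-elsewhere effect:

```python
from scipy import stats

# Invented example numbers (NOT real ATLAS/CMS values): expected background
# and observed events in some diphoton invariant-mass window.
expected_background = 30.0
observed_events = 45

# Local p-value: probability for the background alone to fluctuate up to at
# least the observed count, assuming Poisson-distributed event counts.
p_local = stats.poisson.sf(observed_events - 1, expected_background)

# Convert to the usual one-sided Gaussian "number of sigmas".
z_local = stats.norm.isf(p_local)

print(f"local p-value = {p_local:.2e}, local significance = {z_local:.1f} sigma")
```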

5. What are the potential implications of the diphoton excess?

If the diphoton excess is confirmed to be a real signal and not just a statistical fluctuation, it could lead to the discovery of new particles or interactions that are not predicted by the Standard Model. This could greatly advance our understanding of the fundamental laws of nature and potentially open up new avenues for research and technology.
