LHC Part 4: Searching for New Particles and Decays - Comments

In summary, mfb has submitted a new PF Insights post discussing the search for new particles and decays at the LHC. The article explains the look-elsewhere effect and compares it to similar effects in other scientific fields. The conversation also covers how the massive amount of data collected at the LHC is analyzed, as well as the recent release of datasets from ATLAS and CMS. It is noted that while the most recent datasets are not yet publicly available, studies with 7 and 8 TeV data are ongoing. The conversation also touches on the importance of precision measurements and the timelines for data analysis and release.
  • #1
mfb submitted a new PF Insights post

LHC Part 4: Searching for New Particles and Decays

[Image: LHC4.png]


Continue reading the Original PF Insights Post.
 
  • Like
Likes Ygggdrasil, QuantumQuest, Orodruin and 2 others
  • #2
Excellent article with very clear explanations.
By looking at more places, we made it more likely to see larger statistical fluctuations. This is called look-elsewhere-effect or trials factor.
This is also known as the Green Jelly Bean effect in the medical sciences or p-hacking in the social sciences.
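For intuition, here is a minimal numpy toy (my own sketch, not from the article): simulate background-only experiments with many independent search bins and count how often a "local 3 sigma" bump appears somewhere.

Code:
import numpy as np

rng = np.random.default_rng(42)

n_bins = 100        # independent search regions (e.g. mass bins)
n_trials = 100_000  # background-only pseudo-experiments
threshold = 3.0     # "local" significance threshold, in sigma

# One standard-normal fluctuation per bin per pseudo-experiment.
z = rng.standard_normal((n_trials, n_bins))

# Fraction of experiments where at least one bin exceeds the threshold.
p_global = np.mean(z.max(axis=1) > threshold)

print(f"local one-sided p-value at 3 sigma: {0.00135:.5f}")
print(f"global p-value with {n_bins} bins:  {p_global:.3f}")
# Comes out near 0.13: with 100 places to look, a "3 sigma" bump
# shows up somewhere in about one in eight background-only experiments.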
A really weird statistical fluctuation, a new particle, or some weird experimental effect?
Any guesses at this point, or should we all wait for Friday to see what the additional analyses have turned up?
 
  • #3
Thanks :)
Ygggdrasil said:
Any guesses at this point, or should we all wait for Friday to see what the additional analyses have turned up?
Wait for Friday. There are various rumors around; I won't comment on them.

I'll post results here as soon as they are public.

Found this nice description by CERN; slightly different focus, but a large overlap in topics. With more pictures!
 
  • Like
Likes vanhees71
  • #5
Great overview. As a quantum gravity guy trying to learn more about phenomenology, your articles are just perfect!
 
  • Like
Likes Preetica and mfb
  • #6
What kind of data files and analytics software are you guys using to dig through 2.5+ quadrillion collision events?
 
  • #7
stoomart said:
What kind of data files and analytics software are you guys using to dig through 2.5+ quadrillion collision events?
That's why triggers are used: to decrease the rate of collected events to a manageable size [not on a local computer, of course]. https://inspirehep.net/record/1196429/files/soft-2004-007.pdf
For local computers, the sizes you're dealing with depend on the amount of recorded data and the analysis you are doing.
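Conceptually a trigger is just a fast filter over events. A minimal Python sketch (the event structure and the 25 GeV threshold are made up for illustration):

Code:
# Toy trigger: keep an event only if it contains at least one
# reconstructed object above a transverse-momentum threshold.
def trigger_accept(event, pt_threshold_gev=25.0):
    return any(obj["pt"] > pt_threshold_gev for obj in event["objects"])

# Two toy events: one soft, one with a hard electron.
events = [
    {"objects": [{"pt": 3.2}, {"pt": 7.9}]},
    {"objects": [{"pt": 41.5}, {"pt": 12.0}]},
]
kept = [e for e in events if trigger_accept(e)]
print(f"kept {len(kept)} of {len(events)} events")  # kept 1 of 2 events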
 
  • Like
Likes stoomart
  • #8
The experiments start with a 40 MHz bunch crossing rate. At ~2 MB/event (ATLAS/CMS, lower for LHCb) that is 80 TB/s. You cannot even read out such a data rate. The experiments read out a small part and look for the most interesting collisions there (mainly looking for high-energy processes). That reduces the event rate to ~100 kHz (ATLAS/CMS) or 1 MHz (LHCb). 200 GB/s are then fed into computer farms and analyzed in more detail. Again the data is reduced to the most interesting events, ~1 kHz for ATLAS/CMS and ~10 kHz for LHCb. Those are stored permanently. The information about which possible physics process happened there (e.g. "the reconstruction found two high-energy electrons") is also stored.
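The rate reductions above as back-of-the-envelope arithmetic (a sketch using the round numbers quoted in this post, not official figures):

Code:
event_size_bytes = 2e6  # ~2 MB/event for ATLAS/CMS

stages_hz = {
    "bunch crossings (40 MHz)":  40e6,
    "after first trigger level": 100e3,
    "written to storage":        1e3,
}

for name, rate in stages_hz.items():
    print(f"{name:<27} {rate * event_size_bytes / 1e12:8.4f} TB/s")
# bunch crossings: 80 TB/s; first level: 0.2 TB/s (the 200 GB/s above);
# storage: 0.002 TB/s (about 2 GB/s).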

Individual analyses can then access those datasets. As an example, an analysis could look for events with two high-energy electrons: those might have a rate of 3 Hz during data-taking, which means you have something like 12 million events (~20 TB for ATLAS/CMS). That number varies a lot between analyses; some have just a few thousand events, some have 100 million. Those events are then processed by the computing grid, typically producing a smaller dataset (gigabytes) with just the information you care about. The GB-sized files are typically .root files and are studied with C++ or Python on single computers or a few computers at a time. Everything before that uses code and data formats developed for the individual experiments.
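As an illustration of that last step, a minimal sketch of reading such a GB-sized .root file in Python with the uproot library (the file name, tree name, and branch name are hypothetical placeholders):

Code:
import uproot  # pip install uproot

# Open a (hypothetical) reduced analysis file and inspect it.
with uproot.open("dielectron_skim.root") as f:
    tree = f["Events"]             # hypothetical tree name
    print(tree.num_entries, "events")
    print(tree.keys())             # list the available branches
    # Read only the branch you care about into a NumPy array:
    pt = tree["electron_pt"].array(library="np")  # hypothetical branch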

ALICE has much lower event rates, so the earlier steps are easier there, the later steps look very similar.
 
  • Like
Likes Preetica and stoomart
  • #9
Sounds like a similar data pipeline to the one I use for mining network and host events for security findings, though on a much larger scale (I'm currently condensing 20-30 million filtered and stored events per day into ~30 "interesting" events). Thanks for the link @ChrisVer, I'm definitely going to read through it and maybe even get some LHC data to play around with. Thanks guys!
 
  • #10
I think some datasets became available to the public last year? If you search for them, you may find a way to access them without having to be a member of the collaboration, and they should be easy to deal with on a local machine.
 
  • #11
Both ATLAS and CMS have released older datasets; LHCb has some tiny example datasets but will publish more soon, and ALICE will follow as well.
ATLAS
CMS

The full released CMS dataset has 300 TB. Dealing with that on a local machine can be "tricky," but they also have smaller subsamples. I don't know the size of the ATLAS data, but I would expect it to be similar.
Both datasets come with additional simulation samples necessary to understand the detector better.
 
  • Like
Likes stoomart
  • #12
Unfortunately it looks like the TAG data (event metadata) that I'm interested in analyzing is only stored in a relational database, which is not available for download.
 
  • #13
Very nice and understandable article, thanks @mfb! I did not read it until today, but better late than never.
 
  • #14
It's good they released ATLAS datasets, but that is only 8 TeV. We all know the fun stuff happens past 10 TeV. Meanwhile we anxiously await ALICE. An event happens, they just don't have the capabilities to interpret the dataset.
 
  • #15
Tommy 101 said:
but that is only 8 TeV.
I am not sure, but I guess that's the point. Data that have been thoroughly studied should be accessible to groups or people outside the collaboration. Data is always useful [educationally, or even for researchers and some theorists].
Also, don't underestimate the 8 TeV data; studies are still being done on those samples... After all, ATLAS is not just a machine dedicated to searching for new physics, it studies the Standard Model too (cross sections, polarizations, etc.).

Tommy 101 said:
We all know the fun stuff happens past 10 TeV
Do we? It may even start past 20 TeV.

Tommy 101 said:
Meanwhile we anxiously await ALICE. An event happens, they just don't have the capabilities to interpret the dataset.
I don't understand your last sentence here; could you make it clearer?
 
  • #16
The recent W mass measurement was done with 7 TeV data, and there are various studies with 7 and 8 TeV ongoing. Precision measurements take time.

The most recent datasets are not made public yet because the collaborations that built the detectors and did all the work to gather the data also want to analyze it first. This is not particle-physics specific. A group doing some laser physics or whatever also doesn't release raw data before writing a publication about the results. Chances are good you'll never see the raw data for most experiments. In particle physics, you do.
 
  • #17
Someone made a video about the same topic, with nice animations.
 
  • Like
Likes DennisN and stoomart
