Insights LHC Part 4: Searching for New Particles and Decays - Comments

  Aug 1, 2016 #1

    mfb


    Staff: Mentor

  Aug 1, 2016 #2

    Ygggdrasil

    Science Advisor

    Excellent article with very clear explanations.
    This is also known as the Green Jelly Bean effect in the medical sciences or p-hacking in the social sciences.
    Any guesses at this point, or should we all wait for Friday to see what the additional analyses have turned up?
     
  Aug 1, 2016 #3

    mfb


    Staff: Mentor

    Thanks :)
    Wait for Friday. There are various rumors around; I won't comment on them.

    I'll post results here as soon as they are public.

    I found this nice description by CERN; it has a slightly different focus but a large overlap in topics. With more pictures!
     
    Last edited: Aug 4, 2016
  Aug 4, 2016 #4

    mfb


    Staff: Mentor

  Sep 9, 2016 #5

    haushofer

    Science Advisor

    Great overview. As a quantum gravity guy trying to learn more about phenomenology, your articles are just perfect!
     
  Dec 19, 2016 #6
    What kind of data files and analytics software are you guys using to dig through 2.5+ quadrillion collision events?
     
  Dec 19, 2016 #7

    ChrisVer

    Gold Member

    That's why triggers are used: to reduce the rate of recorded events to a manageable size [not on a local computer, of course]. https://inspirehep.net/record/1196429/files/soft-2004-007.pdf
    For local computers, the size you end up dealing with depends on how much data was recorded and on the analysis you are doing.
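    To make the trigger idea concrete, here is a toy sketch (nothing like the real trigger software, just the concept): keep only the events that contain an object above some threshold.
    Code (Python):
    import random

    THRESHOLD_GEV = 25.0  # hypothetical trigger threshold

    def toy_trigger(events, threshold=THRESHOLD_GEV):
        """Yield only events with at least one object above the threshold."""
        for event in events:
            if max(event["pt"]) > threshold:
                yield event

    # Fake events: each carries a few random transverse momenta (in GeV).
    events = [{"pt": [random.expovariate(1 / 10.0) for _ in range(3)]}
              for _ in range(100_000)]

    accepted = list(toy_trigger(events))
    print(f"kept {len(accepted)} of {len(events)} events")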
     
  Dec 19, 2016 #8

    mfb


    Staff: Mentor

    The experiments start with a 40 MHz bunch crossing rate. At ~2 MB/event (ATLAS/CMS, lower for LHCb) that is 80 TB/s. You cannot even read out such a data rate. The experiments read out a small part and look for the most interesting collisions there (mainly looking for high-energy processes). That reduces the event rate to ~100 kHz (ATLAS/CMS) or 1 MHz (LHCb). The resulting 200 GB/s are then fed into computer farms and analyzed in more detail. Again the data is reduced to the most interesting events, ~1 kHz for ATLAS/CMS and ~10 kHz for LHCb. Those are stored permanently, together with information about which physics signature was found (e.g. "the reconstruction found two high-energy electrons").
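    (A quick sanity check of those numbers; this is just the arithmetic from the paragraph above, written out in Python.)
    Code (Python):
    bunch_crossing_rate = 40e6  # 40 MHz
    event_size_bytes = 2e6      # ~2 MB/event (ATLAS/CMS)

    print(f"raw: {bunch_crossing_rate * event_size_bytes / 1e12:.0f} TB/s")    # 80 TB/s
    print(f"after first trigger: {100e3 * event_size_bytes / 1e9:.0f} GB/s")   # 200 GB/s
    print(f"written to storage: {1e3 * event_size_bytes / 1e9:.0f} GB/s")      # 2 GB/s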

    Individual analyses can then access those datasets. As an example, an analysis could look for events with two high-energy electrons: those might occur at a rate of 3 Hz during data-taking, which means you have something like 12 million events (~20 TB for ATLAS/CMS). That number varies a lot between analyses; some have just a few thousand events, some have 100 million. Those events are then processed by the computing grid, typically producing a smaller dataset (gigabytes) with just the information you care about. The GB-sized files are typically .root files and are studied with C++ or Python on single computers or a few computers at a time. Everything before that uses code and data formats developed for the individual experiments.
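    For that last step, here is a minimal sketch of how such a GB-sized .root file can be opened in Python with the uproot library (the file, tree, and branch names below are made up, and I assume a flat ntuple):
    Code (Python):
    import uproot  # pip install uproot; reads .root files without a full ROOT install

    # Hypothetical names, just to show the pattern.
    with uproot.open("dielectron_skim.root") as f:
        tree = f["Events"]
        print(tree.num_entries, "events")

        # Read only the branches you care about into NumPy arrays.
        arrays = tree.arrays(["el_pt", "el_eta"], library="np")
        print((arrays["el_pt"] > 25.0).sum(), "electrons above 25 GeV")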

    ALICE has much lower event rates, so the earlier steps are easier there; the later steps look very similar.
     
  Dec 19, 2016 #9
    Sounds like a similar data pipeline to what I use for mining network and host data for interesting security events, though on a much larger scale (I'm currently condensing 20-30 million filtered and stored events per day into ~30 "interesting" events). Thanks for the link @ChrisVer, I'm definitely going to read through it and maybe even get me some LHC data to play around with. Thanks guys!
     
  Dec 20, 2016 #10

    ChrisVer

    Gold Member

    I think some datasets became available to the public last year? If you search for them, you may find a way to access them without having to be a member of the collaboration, and they should be easy to deal with on a local machine.
     
  Dec 20, 2016 #11

    mfb


    Staff: Mentor

    Both ATLAS and CMS have released older datasets; LHCb has some tiny example datasets but will publish more soon, and ALICE will follow as well.
    ATLAS
    CMS

    The full released CMS dataset is 300 TB. Dealing with that on a local machine can be "tricky", but they also have smaller subsamples. I don't know the size of the ATLAS data, but I would expect it to be similar.
    Both datasets come with additional simulation samples necessary to understand the detector better.
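    For anyone who wants to try: the files are listed on the CERN Open Data portal (opendata.cern.ch), and many can be read directly over the network, e.g. with uproot. The path below is a placeholder (the portal lists the real file URLs), and remote root:// access needs the XRootD bindings installed:
    Code (Python):
    import uproot

    # Placeholder URL: substitute a real file from opendata.cern.ch.
    url = "root://eospublic.cern.ch//eos/opendata/cms/.../file.root"
    with uproot.open(url) as f:
        print(f.keys())  # list the objects (trees, histograms) in the file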
     
  Jan 8, 2017 #12
    Unfortunately, it looks like the TAG data (event metadata) that I'm interested in analyzing is only stored in a relational database, which is not available for download.
     
  Jan 12, 2017 #13
    Very nice and understandable article, thanks @mfb! I did not read it until today, but better late than never.
     
  Feb 4, 2017 #14
    It's good they released ATLAS datasets, but that is only 8 TeV. We all know the fun stuff happens past 10 TeV. Meanwhile we anxiously await ALICE. An event happens, they just don't have the capabilities to interpret the dataset.
     
  Feb 5, 2017 #15

    ChrisVer

    Gold Member

    I am not sure, but I guess that's the point: data that have been thoroughly studied should be accessible to groups and people outside the collaboration. Data is always useful [educationally, or even for researchers and some theorists].
    Also, don't underestimate the 8 TeV data; studies are still being done on those samples... after all, ATLAS is not just a machine dedicated to searching for new physics, it also studies the Standard Model (cross sections, polarizations, etc.).

    Do we? It may even start past 20 TeV.

    I don't understand your last sentence; could you make it clearer?
     
  Feb 5, 2017 #16

    mfb


    Staff: Mentor

    The recent W mass measurement was done with 7 TeV data, and various studies with 7 and 8 TeV data are ongoing. Precision measurements take time.

    The most recent datasets are not public yet because the collaborations that built the detectors and did all the work to gather the data also want to analyze it first. This is not specific to particle physics: a group doing some laser physics or whatever also doesn't release raw data before writing a publication about the results. Chances are good you'll never see the raw data of most experiments at all; in particle physics, you eventually do.
     
  Jun 8, 2017 #17

    mfb


    Staff: Mentor

    Someone made a video about the same topic, with nice animations.
     