What is the collaborative process behind CERN's particle result modeling?

  • Thread starter: scienceboy1
  • Tags: Cern, Model
scienceboy1
Hi all,
I was wondering how CERN models their particle results as we see them in images.
As I understand it, Heisenberg's Uncertainty Principle tells us that we are unable to precisely observe subatomic or fundamental particles, since measuring one would alter its velocity or position. How, then, can they precisely model the results of their experiments? What methods do they use?

Thanks in advance.

P.S. Apologies if any of my information is wrong, I'm still in high school so my knowledge is horrifyingly limited!
If I followed an incorrect format or posted in the wrong section, please let me know - this is my first time here.
 
Alex Lehm said:
I was wondering how CERN models their particle results as we see them in images.
Which images? Event displays like these?
The uncertainty relation is completely negligible for these tracks - the position uncertainty it implies is less than the size of an atom, while the pictures show tracks with lengths of meters. The measurement uncertainties of the detectors are many orders of magnitude larger than the fundamental limit given by the uncertainty principle.
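
As a rough illustration of the scale involved (a back-of-the-envelope sketch; the 1 GeV/c track momentum and the 1% momentum resolution are assumed example values, not numbers from the post):

```python
# Minimum position uncertainty from the Heisenberg relation for a track whose
# momentum (~1 GeV/c, assumed) is known to ~1% (assumed), compared with the
# size of an atom and the metre-scale tracks seen in an event display.
hbar_c = 197.327e-15      # MeV * m  (hbar*c = 197.327 MeV*fm)

p = 1000.0                # MeV/c, assumed track momentum
delta_p = 0.01 * p        # MeV/c, assumed ~1% momentum resolution

delta_x = hbar_c / (2.0 * delta_p)   # delta_x >= hbar / (2*delta_p)

print(f"Heisenberg limit on position: {delta_x:.1e} m")   # ~1e-14 m
print("size of an atom:               ~1e-10 m")
print("typical track length:          ~1 m")
```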

The measurement uncertainty of the detectors can be a challenge for the experiments. As an example, if you reconstruct the decay of a Higgs boson to two photons, you use the measured properties of the photons to calculate the Higgs boson mass. A hypothetical perfect detector would always get the same value to within ~0.004% (the natural uncertainty of this value); in practice the measurements vary by about 1%. That makes it harder to see an excess of events at the Higgs mass - you need much larger datasets than you would with a "perfect" detector. Here is an example graph and some more details.
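
A toy numerical sketch of that effect (my own illustration, not the experiments' actual reconstruction code; the equal photon energies, the opening-angle range, and the flat 1% energy resolution are simplifying assumptions):

```python
# Toy H -> gamma gamma reconstruction: for massless photons the diphoton
# invariant mass is m^2 = 2*E1*E2*(1 - cos(theta)). Smearing each photon
# energy by ~1% (assumed detector resolution) smears the reconstructed mass,
# even though the true value is essentially a single number (125 GeV).
import math
import random

M_H = 125.0           # GeV, Higgs boson mass
RESOLUTION = 0.01     # assumed ~1% photon energy resolution
N_EVENTS = 100_000

random.seed(1)
masses = []
for _ in range(N_EVENTS):
    theta = random.uniform(0.5, math.pi)      # opening angle (toy choice)
    # Equal photon energies chosen so the true mass is exactly M_H:
    E = M_H / math.sqrt(2.0 * (1.0 - math.cos(theta)))
    E1 = random.gauss(E, RESOLUTION * E)      # smeared measurements
    E2 = random.gauss(E, RESOLUTION * E)
    masses.append(math.sqrt(2.0 * E1 * E2 * (1.0 - math.cos(theta))))

mean = sum(masses) / len(masses)
rms = math.sqrt(sum((m - mean) ** 2 for m in masses) / len(masses))
print(f"reconstructed mass: {mean:.2f} +/- {rms:.2f} GeV "
      f"({100 * rms / mean:.1f}% spread)")
```

The spread this toy gives (roughly 0.7%) plays the role of the detector resolution, far larger than the ~0.004% natural width, which is why a real search needs the larger datasets described above.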
 
mfb said:
Which images? Event displays like these?
...

Thank you
 
Alex Lehm said:
Hi all,
I was wondering how CERN models their particle results as we see them in images.

Since you're still in high school, there is another thing you should learn: the work being done at CERN is not done only by CERN employees. All of the major scientific efforts at CERN (and at Fermilab, NASA, Super Kamiokande, etc.) are done by collaborators from institutions all over the world. Look at this, for example:

https://arxiv.org/pdf/1808.00336.pdf

Scroll down and look at the institutions that are represented within the "ATLAS collaboration". What percentage of these people are actually working for CERN directly?

I'm pointing this out because a lot of people, especially outside of physics and those who are still in high school, often do not realize that major scientific projects, while done under the banner of one organization, are often done by people from all over the world. You do not need to be working for CERN to work AT CERN. This applies to all the user facilities at various major scientific laboratories all around the world.

This is a very common misconception, and I thought that this is a good opportunity to highlight it since it is obviously happening here.

Zz.
 