Statistical Analysis in High-Energy Physics: Methods and Applications

  • Context: Graduate
  • Thread starter: iibewegung
  • Tags: HEP, statistical
SUMMARY

The discussion focuses on statistical analysis methods used in high-energy physics (HEP), particularly in estimating parameters of the standard model. Key techniques mentioned include maximum likelihood estimation, Bayesian methods, and advanced machine learning approaches such as neural networks and decision trees. Prominent collaborations like ATLAS, CMS, CDF, and DØ are involved in analyzing collider data, often requiring approval from committees for their analyses. Resources such as the PDG and UTFit provide comprehensive insights into various statistical approaches and results in the field.

PREREQUISITES
  • Understanding of maximum likelihood estimation in statistical analysis
  • Familiarity with Bayesian statistics and its applications
  • Knowledge of machine learning techniques, specifically neural networks and decision trees
  • Awareness of high-energy physics experimental collaborations like ATLAS and CMS
NEXT STEPS
  • Research the maximum likelihood estimation method in detail
  • Explore Bayesian methods in statistical analysis for high-energy physics
  • Learn about machine learning applications in particle physics, focusing on neural networks
  • Investigate the role of the Particle Data Group (PDG) in compiling experimental results
USEFUL FOR

Researchers, physicists, and data analysts involved in high-energy physics, particularly those interested in statistical methods for analyzing collider data and estimating parameters of the standard model.

iibewegung
Hi,

Could anyone tell me what type of statistical test is used to estimate the parameters of the standard model?

I hear many particle physicists say, e.g., "we have 95% confidence that this quark mass falls between A and B," and what immediately comes to mind are classical hypothesis-testing methods like the z-test or t-test.

Any explanations or comments about the statistical analysis being done on the data that we get from colliders?
Thanks in advance!
 
When I was a grad student in HEP many years ago, we commonly used the maximum likelihood method. Don't press me on the details, though. It's been a long time since I did this stuff. :redface:
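To make the maximum likelihood method concrete, here is a minimal sketch in Python. The numbers are purely illustrative (a hypothetical Gaussian "mass measurement" with an invented true value and resolution, not real collider data), and for a Gaussian likelihood the ML estimators happen to have closed forms:

```python
# Toy maximum-likelihood fit of a "particle mass" from Gaussian measurements,
# with an approximate 95% confidence interval on the result.
# All numbers are hypothetical; this is an illustration, not a real analysis.
import numpy as np

rng = np.random.default_rng(42)
true_mass, resolution = 172.5, 1.5            # assumed toy values (GeV)
data = rng.normal(true_mass, resolution, size=400)

# For a Gaussian likelihood the maximum-likelihood estimators are closed-form:
mass_hat = data.mean()                        # ML estimate of the mean
sigma_hat = data.std(ddof=0)                  # ML estimate of the width

# Approximate 95% confidence interval on the mean (large-sample normal theory)
half_width = 1.96 * sigma_hat / np.sqrt(len(data))
lo, hi = mass_hat - half_width, mass_hat + half_width
print(f"mass = {mass_hat:.2f} GeV, 95% CI = [{lo:.2f}, {hi:.2f}]")
```

This is exactly the kind of statement quoted above: "we have 95% confidence that this quark mass falls between A and B." In a real HEP analysis the likelihood is rarely Gaussian and is usually maximized numerically, but the logic is the same.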
 
Different groups can and do use different methods. I (as a theorist) imagine that debating what methods to use is a big part of life in the major experimental collaborations (ATLAS, CMS, CDF, DØ, etc.), which are the ones who directly analyze collider data. I believe standard procedure is for smaller working groups to perform analyses that they then try to get approved by various committees. As signals get harder to see, more elaborate methods (neural nets, decision trees) have to be used.

Here are some slides you may find interesting: http://physics.bu.edu/~tulika/Teaching/Spring09/lectures/Lecture8.pdf

At a higher level, various groups (UTfit, http://www.utfit.org/, etc.) perform statistical analyses of many experimental results, again using different methods. (The UTfit group likes Bayesian methods, for instance.)
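For contrast with the frequentist interval above, here is a minimal sketch of the Bayesian approach such groups favor: a flat prior on the parameter, a Gaussian likelihood with known width, and a 95% credible interval read off the posterior. Again, the numbers are invented for illustration and this is nothing like a real UTfit global fit:

```python
# Toy Bayesian estimate: flat prior, Gaussian likelihood evaluated on a grid,
# 95% credible interval from the posterior CDF. Numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(4.18, 0.03, size=50)        # toy "quark mass" data (GeV)
sigma = 0.03                                   # assume the resolution is known

grid = np.linspace(4.0, 4.4, 4001)             # candidate parameter values
# Log-likelihood of each candidate value, summed over the data points
loglike = (-0.5 * ((data[:, None] - grid[None, :]) / sigma) ** 2).sum(axis=0)
posterior = np.exp(loglike - loglike.max())    # flat prior: posterior ∝ likelihood
posterior /= posterior.sum()                   # normalize on the grid

# 95% credible interval from the 2.5% and 97.5% quantiles of the posterior
cdf = np.cumsum(posterior)
lo_b = grid[np.searchsorted(cdf, 0.025)]
hi_b = grid[np.searchsorted(cdf, 0.975)]
print(f"95% credible interval: [{lo_b:.3f}, {hi_b:.3f}] GeV")
```

The numerical interval here happens to match the frequentist one because the prior is flat and the likelihood is Gaussian; the two approaches diverge once priors and non-Gaussian likelihoods matter, which is where the methodological debates in the collaborations come from.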

The Particle Data Group (http://pdg.lbl.gov/) is the one-stop spot for compiling and comparing the various approaches and their results.
 
