Maximum Likelihood Fitting normalization (Bevington)

In summary, if you are trying to estimate a and b separately while using the normalization factor in the fitting process, you can use the extended likelihood method by defining the likelihood function as described above and solving for a and b using the partial derivatives.
  • #1
stoked
Hi,
I'm posting this in this particular forum because, though this is a statistics question, my application is in high-energy physics.

My question is regarding a problem in Bevington's book (Data Reduction and Error Analysis..., Page 193, Ex. 10.1), but I'll give a general description here...

Say you want to fit a scattering distribution function f = (a + b cos²θ) using the likelihood method. With cos θ ranging from -1 to 1, you get a normalization integral, norm = 2(a + b/3). However, what goes into the actual code is the normalized function (a + b cos²θ) / [2(a + b/3)], so the fitter is only sensitive to the ratio a/b and not to a and b separately. So if you were asked to estimate a and b (as Bevington's problem does)...what do you do?

I am aware of the "extended" likelihood method. Is that applicable here? If so, how?

Thanks a bunch.
 
  • #2
Yes, the extended likelihood method is applicable here. The extended maximum likelihood method lets you estimate a and b separately by adding the observed number of events to the fit. The ordinary likelihood,

L(a, b) = ∏ᵢ f(θᵢ | a, b) / norm,

where f(θᵢ | a, b) is the scattering distribution function and norm = 2(a + b/3) is the normalization factor, depends only on the shape of the distribution, i.e. on the ratio a/b. The extended likelihood multiplies this by a Poisson term for the total number of observed events N, with expected yield ν(a, b) proportional to the normalization integral:

L_ext(a, b) = (ν^N e^(-ν) / N!) ∏ᵢ f(θᵢ | a, b) / norm.

Because ν depends on 2(a + b/3), taking the partial derivatives of ln L_ext with respect to a and b, setting them equal to zero, and solving gives maximum likelihood estimates for a and b individually.
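A minimal numerical sketch of the extended fit (in Python with numpy/scipy; the parameter values, the event scale, and all variable names here are made up for illustration): generate events from f = a + b·x², with x playing the role of cos θ on [-1, 1], then minimize the negative extended log-likelihood, whose Poisson yield term depends on 2(a + b/3) and so breaks the a/b degeneracy:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Illustrative "true" values and a known event scale (events per unit of f)
a_true, b_true = 2.0, 1.5
scale = 1000.0

def f(x, a, b):
    return a + b * x**2          # x plays the role of cos(theta)

# Total yield is Poisson with mean scale * (integral of f over [-1, 1])
nu_true = scale * 2 * (a_true + b_true / 3)
n_events = rng.poisson(nu_true)

# Accept-reject sampling of x from the shape of f
x = np.empty(0)
while x.size < n_events:
    cand = rng.uniform(-1, 1, 2 * n_events)
    keep = rng.uniform(0, a_true + b_true, cand.size) < f(cand, a_true, b_true)
    x = np.concatenate([x, cand[keep]])[:n_events]

# Negative extended log-likelihood (constants such as log N! dropped):
# nu(a, b) - sum_i log(scale * f(x_i; a, b))
def neg_ext_loglike(params):
    a, b = params
    if min(a, a + b) <= 0:       # f must stay positive on [-1, 1]
        return np.inf
    nu = scale * 2 * (a + b / 3)
    return nu - np.sum(np.log(scale * f(x, a, b)))

res = minimize(neg_ext_loglike, x0=[1.0, 1.0], method="Nelder-Mead")
a_hat, b_hat = res.x
```

Because the Poisson term ties the absolute yield to 2(a + b/3), the fit determines a and b individually rather than only their ratio.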
 
  • #3
Hi there,

Thank you for your question. Maximum likelihood fitting with a normalization factor is a common method in statistics for estimating the parameters of a distribution function. In your case, the normalization integral ensures that the fitted function integrates to 1, so what actually enters the code is the normalized function (a + b cos²θ) / [2(a + b/3)]. As you noted, this makes the fitter sensitive only to the ratio of a and b, not to their individual values.

One approach to estimating a and b separately is the "extended" likelihood method you mentioned. It incorporates the observed number of events, whose expectation depends on the normalization factor, into the likelihood function, which makes the fit sensitive to a and b individually. Another approach is to bin the data and perform a non-linear least-squares fit of the predicted counts per bin, for example with the Levenberg-Marquardt algorithm; since the predicted counts depend on the overall yield as well as the shape, this also determines a and b separately.
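A hedged sketch of the least-squares route (all parameter values and the overall scale are made up; scipy's curve_fit uses Levenberg-Marquardt for unbounded problems): generate Poisson bin counts from f = a + b·x² and fit the predicted counts per bin:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

# Illustrative "true" parameters and a known overall scale
# (e.g. integrated luminosity times efficiency)
a_true, b_true = 2.0, 1.5
C = 500.0

# Expected counts per bin: C * bin_width * (a + b * x^2), x = cos(theta)
edges = np.linspace(-1.0, 1.0, 41)
centers = 0.5 * (edges[:-1] + edges[1:])
width = edges[1] - edges[0]
counts = rng.poisson(C * width * (a_true + b_true * centers**2))

def model(x, a, b):
    # Predicted counts depend on the overall yield, not just the shape,
    # so a and b are determined separately.
    return C * width * (a + b * x**2)

popt, pcov = curve_fit(model, centers, counts, p0=[1.0, 1.0])
a_hat, b_hat = popt
```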

I hope this helps answer your question. Best of luck with your fitting process!
 

1. What is maximum likelihood fitting normalization?

Maximum likelihood fitting normalization is a statistical method used to determine the parameters of a mathematical model that best fit a given set of data. It involves finding the values of the model's parameters that maximize the likelihood of the observed data occurring.
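As a toy illustration (made-up Gaussian data; the model and values are assumptions, not from the thread), maximizing the likelihood is usually done by minimizing the negative log-likelihood numerically:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)
data = rng.normal(loc=3.0, scale=1.5, size=2000)  # assumed sample

# Negative log-likelihood of a Gaussian model (constant terms dropped)
def nll(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    return 0.5 * np.sum(((data - mu) / sigma) ** 2) + data.size * np.log(sigma)

res = minimize(nll, x0=[0.0, 1.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x
```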

2. How is maximum likelihood fitting normalization used in Bevington analysis?

In Bevington analysis, maximum likelihood fitting normalization is used to determine the best-fit parameters for a given model by comparing the likelihood of the observed data for different parameter values. This allows for the determination of the most likely model parameters and their uncertainties.

3. What is the difference between maximum likelihood fitting normalization and other fitting methods?

The main difference between maximum likelihood fitting normalization and other fitting methods is that it takes into account the likelihood of the observed data occurring for different parameter values, rather than just minimizing the sum of squared residuals. This can result in a more accurate determination of the model parameters.
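A small simulation (illustrative numbers) makes the difference concrete: for Poisson counts with a constant true rate, the ML estimate is the arithmetic mean of the bins, while least squares weighted by the observed counts (the Neyman chi-square) yields the harmonic mean, which is biased low at small counts:

```python
import numpy as np

rng = np.random.default_rng(7)
lam_true = 5.0           # true Poisson rate per bin (assumed)
n_bins, n_exp = 50, 500  # bins per pseudo-experiment, number of experiments

ml_est, lsq_est = [], []
for _ in range(n_exp):
    n = rng.poisson(lam_true, n_bins)
    # ML estimate of a constant rate: the sample mean
    ml_est.append(n.mean())
    # Weighted least squares with sigma_i^2 = n_i (Neyman chi-square):
    # minimizing sum((n_i - lam)^2 / n_i) gives the harmonic mean
    m = np.maximum(n, 1)  # guard against empty bins
    lsq_est.append(len(m) / np.sum(1.0 / m))

# ML is unbiased here; the Neyman chi-square estimate sits below the truth
print(np.mean(ml_est), np.mean(lsq_est))
```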

4. How do you interpret the results of maximum likelihood fitting normalization?

The results of maximum likelihood fitting normalization are typically presented as the best-fit parameters for the model, along with their associated uncertainties. These values can then be used to make predictions about the data or to compare different models.
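For example (made-up straight-line data), with scipy's curve_fit the 1-sigma parameter uncertainties are read off the diagonal of the returned covariance matrix:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)

# Noisy straight-line data with known measurement error (assumed values)
x = np.linspace(0, 10, 50)
y = 1.2 * x + 0.5 + rng.normal(0, 0.8, x.size)

def line(x, m, c):
    return m * x + c

popt, pcov = curve_fit(line, x, y, sigma=np.full(x.size, 0.8),
                       absolute_sigma=True)
perr = np.sqrt(np.diag(pcov))  # 1-sigma uncertainties on m and c
m_hat, c_hat = popt
```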

5. Are there any limitations to using maximum likelihood fitting normalization?

Like any statistical method, maximum likelihood fitting normalization has its limitations. It assumes that the observed data follows a specific probability distribution, which may not always be the case. It also requires a good understanding of the underlying model and its parameters in order to interpret the results accurately.
