How do I normalize the likelihood distribution?

In summary: a user who is new to statistics is trying to reproduce the result of a paper in which a theoretical parameter is constrained. They have calculated the chi-squared and plotted the likelihood function, but are unsure which normalization constants they are missing. The conversation discusses whether the reduced chi-squared should be used and the need to integrate the probability distribution (analytically or numerically) to determine the normalization. The unknown parameters in the chi-squared are never solved for directly; only their probability distributions are found. The user tries several ways to normalize the curve, and the thread concludes that the authors of the paper probably just rescaled the graph so that its peak equals one.
  • #1
aymer
Hey everyone,

I am new to using statistics and have come across a problem. I am trying to reproduce the result of a paper in which a theoretical parameter is constrained.
I have calculated the chi-squared and plotted the likelihood function, [itex]\exp(-\chi^2)[/itex], to see which value of the parameter has the highest likelihood.
I am getting almost the same value as the paper, but my graph is not scaled like theirs. I would like the maximum value of the likelihood to be 1. Which normalization constants am I missing?

Also, in the likelihood function, do we use the reduced chi-squared?
 
  • #2
Well, first of all, there may be a factor-of-two error. The way the chi-squared is usually defined, you should be dividing it by two in the exponent of the probability distribution.

That said, the key point here is that for most problems the normalization is simply not known when the chi-squared is generated, so you have to integrate the probability distribution yourself to determine the normalization. If your distribution is Gaussian, you can do this analytically; for real distributions we typically use Markov Chain Monte Carlo to perform the integration numerically.

At any rate, you can read up on the multivariate normal distribution here:
http://en.wikipedia.org/wiki/Multivariate_normal_distribution
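
For illustration, here is a minimal sketch in Python of building the (unnormalized) likelihood [itex]e^{-\chi^2/2}[/itex] on a grid of the parameter. The data, errors, and model below are made-up placeholders, not the ones from the paper:

[code]
import numpy as np

# Made-up data: x-values, measurements y, and 1-sigma errors (placeholders for illustration only)
x = np.array([0.1, 0.5, 1.0, 1.5, 2.0])
y = np.array([1.1, 1.6, 2.1, 2.4, 3.2])
sigma = np.array([0.2, 0.2, 0.3, 0.3, 0.4])

def model(x, theta):
    """Toy model; stands in for the theoretical prediction being constrained."""
    return 1.0 + theta * x

def chi_squared(theta):
    """Standard (unreduced) chi-squared summed over the data points."""
    return np.sum(((y - model(x, theta)) / sigma) ** 2)

# Likelihood up to a normalization constant; note the factor of 1/2 in the exponent.
theta_grid = np.linspace(-2.0, 4.0, 2001)
likelihood = np.array([np.exp(-0.5 * chi_squared(t)) for t in theta_grid])
[/code]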
 
  • #3
Thanks.
I thought of integrating the likelihood and setting the integral equal to one to find the normalization constant, but the problem is that my chi-squared is itself a function of an unknown parameter, the value of which I want to find. So we have two unknowns, the parameter and the normalization constant, and only one equation!
 
  • #4
aymer said:
Thanks.
I thought of integrating the likelihood and setting the integral equal to one to find the normalization constant, but the problem is that my chi-squared is itself a function of an unknown parameter, the value of which I want to find. So we have two unknowns, the parameter and the normalization constant, and only one equation!
Well, no. The unknown parameter is left as a variable, like so:

[tex]\int P(x) dx = 1[/tex]

where [itex]x[/itex] here is the set of all parameters in the chi-squared, and the integral runs over the entire available parameter space. If your chi-squared is Gaussian-distributed, this normalization is relatively straightforward. If not, you have to perform the integral numerically.

Basically, the unknown input parameters to the chi-squared are never solved for; we merely find their probability distributions.
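
Continuing the sketch from post #2, a hedged example of doing that integral numerically over a one-dimensional parameter (theta_grid and likelihood as defined there):

[code]
import numpy as np

# theta_grid and likelihood come from the earlier sketch (assumed one-dimensional parameter).
dtheta = theta_grid[1] - theta_grid[0]
area = np.sum(likelihood) * dtheta       # simple rectangle-rule approximation of the integral
posterior = likelihood / area            # now integrates to 1 over the grid

print(np.sum(posterior) * dtheta)        # sanity check: prints ~1.0
[/code]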
 
  • #5
Hi,
I tried doing the normalization as you suggested, by requiring

[tex]A \int_{-\infty}^{\infty} e^{-\chi^2/2}\, d\theta = 1[/tex]

over the parameter [itex]\theta[/itex] and solving for [itex]A[/itex].
Then I plotted [itex]A\, e^{-\chi^2/2}[/itex] against the parameter, but it is still not normalized to 1: the maximum value the function takes is nearly 2.
What am I doing wrong?
 
  • #6
I also tried normalizing each event (I have 17 data points) to 1 and finding the corresponding normalization constants. Then I plotted (product of the constants) [itex]\times\, e^{-\chi^2/2}[/itex], but even that is not scaled to 1.
 
  • #7
aymer said:
Hi,
I tried doing the normalization as you suggested, by requiring

[tex]A \int_{-\infty}^{\infty} e^{-\chi^2/2}\, d\theta = 1[/tex]

over the parameter [itex]\theta[/itex] and solving for [itex]A[/itex].
Then I plotted [itex]A\, e^{-\chi^2/2}[/itex] against the parameter, but it is still not normalized to 1: the maximum value the function takes is nearly 2.
What am I doing wrong?
What do you mean it isn't normalized to one? If the integral of the probability distribution is equal to one, then it is normalized to one. Depending upon the width of the distribution, it could easily take on values at any given point that are much larger than one. This isn't a problem.

Normalizing the individual events changes the problem significantly, and should not be done.
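
To see how the peak can exceed one, consider a toy check (not the thread's actual data): a correctly normalized Gaussian with width 0.2 peaks near 2.

[code]
import numpy as np

# A unit-area Gaussian can still peak above 1 if it is narrow:
# peak height = 1 / (sqrt(2*pi) * sigma) ~ 2.0 for sigma = 0.2.
theta = np.linspace(-2.0, 2.0, 4001)
sigma = 0.2
pdf = np.exp(-0.5 * (theta / sigma) ** 2) / (np.sqrt(2.0 * np.pi) * sigma)

print(np.sum(pdf) * (theta[1] - theta[0]))  # integral ~ 1.0
print(pdf.max())                            # peak ~ 2.0, and that is fine
[/code]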
 
  • #8
Chalnoth said:
What do you mean it isn't normalized to one? If the integral of the probability distribution is equal to one, then it is normalized to one. Depending upon the width of the distribution, it could easily take on values at any given point that are much larger than one. This isn't a problem.

Normalizing the individual events changes the problem significantly, and should not be done.

Thanks.

So I guess what I have done is correct, and the authors of the paper must have just scaled the graph to one somehow. I was worried because I get the peak at the same point, and my graph intersects the x-axis at the same coordinates as in the paper.
 
  • #9
aymer said:
Thanks.

So I guess what I have done is correct, and the authors of the paper must have just scaled the graph to one somehow. I was worried because I get the peak at the same point, and my graph intersects the x-axis at the same coordinates as in the paper.
Right. A lot of the time people don't bother to properly normalize probability distributions, so if they simply set the peak equal to one, that would explain their graph.
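
A sketch of that plotting convention, reusing the theta_grid and likelihood arrays from the earlier (made-up) example:

[code]
import matplotlib.pyplot as plt

# Rescale so the curve peaks at 1 (a plotting convention, not a probability normalization).
scaled = likelihood / likelihood.max()

plt.plot(theta_grid, scaled)
plt.xlabel("parameter")
plt.ylabel("likelihood (peak scaled to 1)")
plt.show()
[/code]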
 

What is a likelihood distribution?

A likelihood distribution (more precisely, a likelihood function) is used in statistics to express how well different values of a model's parameters explain observed data. It is obtained by taking the probability of the observed data and treating it as a function of the parameters.

How is a likelihood distribution different from a probability distribution?

Although they are built from the same formula, they answer different questions. A probability distribution fixes the parameters and gives the probability of each possible outcome; a likelihood fixes the observed outcome and measures how plausible each parameter value is. Likelihoods are used to estimate the parameters of a model and, unlike a probability distribution, a likelihood need not integrate to one over the parameters.
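
As a concrete (purely illustrative) example with a binomial model: the probability distribution fixes the parameter and varies the outcome, while the likelihood fixes the observed outcome and varies the parameter.

[code]
from math import comb

n, p, k_obs = 10, 0.3, 4

# Probability distribution: parameter p fixed, outcome k varies (these values sum to 1 over k).
pmf = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]

# Likelihood: observed k fixed, parameter p varies (need not integrate to 1 over p).
p_grid = [i / 100 for i in range(101)]
likelihood = [comb(n, k_obs) * q**k_obs * (1 - q) ** (n - k_obs) for q in p_grid]
[/code]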

What are the key properties of a likelihood distribution?

The key properties of a likelihood distribution include the shape, location, and scale of the distribution, as well as its moments (mean, variance, skewness, etc.). These properties can be used to describe and compare different distributions.

What is the likelihood principle and how does it relate to likelihood distributions?

The likelihood principle states that the only relevant information for inference is contained in the likelihood function. In other words, the likelihood function captures all the information we need to make probabilistic statements about the data and the underlying population. Likelihood distributions are used to represent this function.

What are some common examples of likelihood distributions?

Some common examples of likelihood distributions include the normal distribution, binomial distribution, Poisson distribution, and exponential distribution. These distributions are commonly used in statistical modeling and have well-defined properties that make them useful for a variety of applications.
