Determining the Likelihood function


Discussion Overview

The discussion revolves around the nature of the likelihood function in statistics, particularly its distinction from probability density functions. Participants explore the implications of treating parameters as variables and the roles of observations in this context.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants propose that the likelihood function can be viewed as the probability density function with the parameter theta as the variable instead of the observations x.
  • Others clarify that the likelihood function does not behave like a probability function, as it typically does not integrate to 1 and may not integrate at all.
  • One participant suggests that in maximum likelihood estimation, the parameter is treated as a fixed, unknown constant rather than a random variable, which affects its interpretation.
  • A participant questions whether using a Gaussian random variable as an example is appropriate, noting that in this case, the likelihood function appears to behave similarly to a probability function.
  • Another participant agrees that the Gaussian example is a special case where the roles of the mean and data are interchangeable, suggesting that exploring other parameters may yield different results.

Areas of Agreement / Disagreement

Participants generally agree on the distinction between likelihood functions and probability functions, but there is ongoing debate about specific examples and the implications of treating parameters as variables.

Contextual Notes

The discussion highlights the limitations of using specific examples, such as the Gaussian distribution, to illustrate the properties of likelihood functions, as these examples may not represent the general behavior of likelihoods across different distributions.

daviddoria
I was under the impression that the likelihood function was simply the probability density function, but viewing the parameter theta as the variable instead of the observations x, i.e.

p(x | theta) = L(theta | x)

However, the likelihood function is no longer a probability function

See Example 1 here:
http://en.wikipedia.org/wiki/Likelihood_function

Can anyone explain this a little bit?

Thanks,
Dave
 
daviddoria said:
I was under the impression that the likelihood function was simply the probability density function but viewing the parameter theta as the variable instead of the observations x.

Right. The observations are fixed in the context of a likelihood function.

daviddoria said:
However, the likelihood function is no longer a probability function

Yeah, it doesn't typically integrate to 1 (and sometimes doesn't integrate at all). Note that in most pdfs, the data and parameters play very different roles, so we shouldn't expect the function to have the same integral over each of them. Also, in maximum likelihood estimation, the parameter is assumed to be a fixed, unknown constant (and NOT a random variable), so there is no distribution assigned to it.

I could probably give a better answer if you could explain why you think the likelihood function should be a probability function.
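The point about the parameter being a fixed, unknown constant can be illustrated with a small sketch: maximum likelihood simply maximizes L(theta | x) over theta for the data in hand, with no distribution placed on theta. A minimal example in plain Python (the five-point sample is hypothetical, and the model is a unit-variance Gaussian whose mean u is the parameter):

```python
# Hypothetical sample; under a unit-variance Gaussian model the
# log-likelihood of the mean u is -(1/2) * sum((x_i - u)^2) + const.
data = [4.2, 5.1, 4.8, 5.5, 4.9]

def log_likelihood(u):
    """Log-likelihood of mean u given the fixed observations."""
    return -0.5 * sum((x - u) ** 2 for x in data)

# Crude grid search for the maximizer; for this model the MLE of the
# mean is exactly the sample mean.
grid = [i / 1000 for i in range(3000, 7000)]
u_hat = max(grid, key=log_likelihood)
print(u_hat, sum(data) / len(data))  # the two agree
```

Note that u_hat is just a number computed from the data, not a draw from any distribution over u.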
 
Maybe I used a bad example.

I considered a Gaussian random variable with variance 1,

so f(x|u=3) is a Gaussian distribution centered at 3.

Then L(u|x=5) is a Gaussian distribution centered at 5.

Is that correct? That must just be a "special" case and hence a bad example?

Dave
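Dave's special case can be checked numerically: with unit variance, L(u | x=5) = f(x=5 | u) is itself a Gaussian in u centered at the observation, and it integrates to 1 over u just like a genuine density. A quick sketch in plain Python (trapezoid rule; x = 5 and variance 1 are taken from the post):

```python
import math

def gaussian_pdf(x, mean, var=1.0):
    """Density of a normal distribution with the given mean and variance."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Likelihood of the mean u given the single observation x = 5:
# L(u | x=5) = f(x=5 | u).  Because (x - u)^2 is symmetric in x and u,
# this is a Gaussian in u, centered at the observation.
x_obs = 5.0
likelihood = lambda u: gaussian_pdf(x_obs, u)

# Trapezoid-rule integral of the likelihood over u; in this special
# case it comes out to 1.
lo, hi, n = -10.0, 20.0, 200_000
h = (hi - lo) / n
total = sum(likelihood(lo + i * h) for i in range(n + 1))
total -= 0.5 * (likelihood(lo) + likelihood(hi))
integral = total * h
print(round(integral, 6))
```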
 
daviddoria said:
maybe I used a bad example

I considered a Gaussian random variable with variance 1,

so f(x|u=3) is a Gaussian distribution centered at 3.

Then L(u|x=5) is a Gaussian distribution centered at 5.

Is that correct? That must just be a "special" case and hence a bad example?

Yeah, that's a special case, as the Gaussian mean and the data play interchangeable roles in the pdf. Try the same thing with a fixed mean and considering the variance to be the parameter, and the resulting likelihood will not be a probability function.

The problem of estimating the mean from data drawn from a unit-variance Gaussian distribution is one of the most pervasive "special cases" in statistics. The MLE for such a situation is equivalent to the minimum-variance unbiased estimator (MVUE), for example.
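The variance suggestion above can be sketched numerically. Fixing the mean and treating the variance v as the parameter, the likelihood's tail decays only like v**-0.5, so its integral over v grows without bound: it is not a density over v, and here it does not integrate at all. A rough check in plain Python (hypothetical single observation x = 5, mean fixed at 0):

```python
import math

def gaussian_pdf(x, mean, var):
    """Density of a normal distribution with the given mean and variance."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Fix the mean and treat the variance v as the parameter.
# Likelihood of v given a single observation x = 5:
x_obs, mean = 5.0, 0.0
likelihood = lambda v: gaussian_pdf(x_obs, mean, v)

def integrate(f, lo, hi, n=100_000):
    """Plain trapezoid rule."""
    h = (hi - lo) / n
    total = sum(f(lo + i * h) for i in range(n + 1))
    total -= 0.5 * (f(lo) + f(hi))
    return total * h

# The integral keeps growing as the upper limit increases, so the
# likelihood over v is not a probability function.
for upper in (1e2, 1e4, 1e6):
    print(upper, round(integrate(likelihood, 1e-3, upper), 3))
```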
 
