Determining the Likelihood function

daviddoria
I was under the impression that the likelihood function was simply the probability density function, but viewed as a function of the parameter theta rather than of the observations x. That is,
p(x|theta) = L(theta|x)
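For concreteness, here is a tiny numerical sketch of what I mean (Python/SciPy, unit-variance Gaussian; the specific numbers are just for illustration):

```python
import numpy as np
from scipy.stats import norm

x_obs = 5.0  # the observation, held fixed

# Evaluate the same unit-variance Gaussian pdf on a grid of candidate
# means theta: this curve, read as a function of theta, is L(theta | x).
thetas = np.linspace(0.0, 10.0, 101)
likelihood = norm.pdf(x_obs, loc=thetas, scale=1.0)

# The curve peaks at theta = 5, the observed value.
print(thetas[np.argmax(likelihood)])  # -> 5.0
```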

However, the likelihood function is no longer a probability function.

See Example 1 here:
http://en.wikipedia.org/wiki/Likelihood_function

Can anyone explain this a little bit?

Thanks,
Dave
 
daviddoria said:
I was under the impression that the likelihood function was simply the probability density function, but viewed as a function of the parameter theta rather than of the observations x.

Right. The observations are fixed in the context of a likelihood function.

daviddoria said:
However, the likelihood function is no longer a probability function.

Yeah, it doesn't typically integrate to 1 (and sometimes doesn't integrate at all). Note that in most pdfs the data and the parameters play very different roles, so we shouldn't expect the function to have the same integral over each of them. Also, in maximum likelihood estimation the parameter is assumed to be a fixed, unknown constant (and NOT a random variable), so there is no distribution assigned to it.
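As a quick toy check (my own example, nothing special about it): after observing a single Bernoulli success, the likelihood of the success probability p is L(p|x=1) = p on [0, 1], which integrates to 1/2 rather than 1:

```python
from scipy.integrate import quad

# Toy example: one Bernoulli trial that came up as a success.
# The likelihood of the success probability p is L(p | x=1) = p on [0, 1].
area, _ = quad(lambda p: p, 0.0, 1.0)
print(area)  # -> 0.5, not 1: this likelihood is not a density in p
```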

I could probably give a better answer if you could explain why you think the likelihood function should be a probability function.
 
Maybe I used a bad example.

I considered a Gaussian random variable with variance 1,

so f(x|mu=3) is a Gaussian density centered at 3.

Then L(mu|x=5) is a Gaussian density centered at 5.

Is that correct? That must just be a "special" case and hence a bad example?
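Here is the quick numerical check behind what I mean (SciPy, unit variance):

```python
import numpy as np
from scipy.stats import norm

mus = np.linspace(0.0, 10.0, 101)

# Likelihood of the mean mu given the fixed observation x = 5 ...
L = norm.pdf(5.0, loc=mus, scale=1.0)
# ... matches a Gaussian density in mu centered at 5, since the
# unit-variance pdf depends on x and mu only through (x - mu)^2.
pdf_in_mu = norm.pdf(mus, loc=5.0, scale=1.0)

print(np.allclose(L, pdf_in_mu))  # -> True
```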

Dave
 
daviddoria said:
Maybe I used a bad example.

I considered a Gaussian random variable with variance 1,

so f(x|mu=3) is a Gaussian density centered at 3.

Then L(mu|x=5) is a Gaussian density centered at 5.

Is that correct? That must just be a "special" case and hence a bad example?

Yeah, that's a special case: the Gaussian mean and the data play interchangeable roles in the pdf (it depends on them only through (x - mu)^2). Try the same thing with the mean fixed and the variance treated as the parameter, and the resulting likelihood will not be a probability function.
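To make that concrete, here is a sketch with made-up numbers (mean fixed at 0, two observations, SciPy for the integration); the likelihood, integrated over the standard deviation, comes out nowhere near 1:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Hypothetical data: two observations from a Gaussian with known mean 0,
# with the standard deviation sigma as the unknown parameter.
data = np.array([1.0, 2.0])

def likelihood(sigma):
    # L(sigma | data) = product of N(x_i; 0, sigma^2) density values
    return np.prod(norm.pdf(data, loc=0.0, scale=sigma))

area, _ = quad(likelihood, 0.0, np.inf)
print(area)  # ~ 0.089, nowhere near 1: not a density in sigma
```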

The problem of estimating the mean from data drawn from a unit-variance Gaussian distribution is one of the most pervasive "special cases" in statistics. The MLE in that setting coincides with the minimum-variance unbiased estimator (MVUE), for example.
 