MHB Log-likelihood and maximum likelihood

AI Thread Summary
To find the log-likelihood function for a binomial experiment with four trials, start by forming the joint probability distribution: multiplying the individual probability mass functions gives L = p²(1-p)². Taking the logarithm gives the log-likelihood function l = 2ln p + 2ln(1-p). The sample mean x̄ is 1/2, which gives the identities 2 = 4x̄ and 2 = 4 - 4x̄. Substituting these into the log-likelihood function yields l = 4x̄ ln p + (4 - 4x̄)ln(1-p), from which the derivative can be taken to obtain the desired result. This clarifies the origin of the coefficient 4 in the derivative.
cajswn
[Attached image: Screenshot 2020-11-08 at 21.03.30.png]

I'm not sure how to get this first derivative (mainly where does the 4 come from?)
I know x̄ is the sample mean (which I think is 1/2?)
Can someone suggest where to start with finding the log-likelihood?

I know the mass function of a binomial distribution is:
$$P(X = x) = \binom{n}{x}p^{x}(1-p)^{n-x}, \qquad x = 0, 1, \ldots, n.$$


Thanks!
 
Hi cajswn,

To determine the likelihood function we need the joint probability distribution of the data. Since the 4 individual trials of the binomial experiment are independent, we simply multiply their 4 individual probability mass functions. With $\bar{x} = 1/2$, two of the four trials are successes and two are failures, so the likelihood is $$L = p^{2}(1-p)^{2}.$$ Now take the logarithm to get the log-likelihood function: $$l = 2\ln p + 2\ln (1-p).$$ Since $\bar{x} = 1/2$ we have $2=4\bar{x}$ and $2 = 4-4\bar{x}$. Using these two identities we get: $$l = 4\bar{x}\ln p +(4-4\bar{x})\ln (1-p).$$ From here, take the derivative of $l$ with respect to $p$ to get the result you're looking for.
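
As a sketch of that final step, differentiating the log-likelihood above gives $$\frac{dl}{dp} = \frac{4\bar{x}}{p} - \frac{4-4\bar{x}}{1-p},$$ and setting this equal to zero leads to $4\bar{x}(1-p) = (4-4\bar{x})p$, i.e. $\hat{p} = \bar{x} = 1/2$. The coefficient 4 appears simply because the sample size is $n = 4$, which lets the exponents 2 and 2 be rewritten as $4\bar{x}$ and $4-4\bar{x}$.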
 
