Maximum likelihood estimators again help

AI Thread Summary
The discussion focuses on finding the maximum likelihood estimator (MLE) for a given probability function. The initial attempt included a log likelihood function and its derivative, but contained errors in the formulation and differentiation process. The correct log likelihood should incorporate the full expression, leading to a derivative that simplifies to (k - nθ)/(θ(1 - θ)). Setting this derivative to zero reveals that the MLE is θ = k/n. Clarifications emphasize the importance of using the complete probability expression and proper differentiation techniques.
semidevil
I don't know, I'm still lost on this whole MLE thing... but here is my attempt at some problems. Please critique (just the concept is still bugging me).

find the MLE:

p_x(k;\theta) = \theta^k (1-\theta)^{1-k}, \quad k = 0, 1, \quad 0 < \theta < 1.

so here is what I did.

L(\theta) = \theta^{nk} (1-\theta)^{\sum_1^n (n-k)}

\ln L(\theta) = nk \ln\theta + \sum_1^k (1-k) \ln(1-\theta)

now, take derivative

nk/\theta + \sum_1^k (1-k)/(1-\theta).

First of all, in getting the formula, is this right so far? I know I will need to leave it in terms of theta, but I don't know if even this much is correct.
 
Now, to find the maximum, set the derivative to 0, so 0 = nk/\theta + \sum_1^k (1-k)/(1-\theta), and then solve for theta:

\theta = nk/\sum_1^k (1-k)

so the MLE is \theta = nk/\sum_1^k (1-k). Please let me know if this is correct, or if I am just completely off track!
 
Hi there,

Thank you for sharing your attempt at finding the MLE for this problem. Overall, your approach is on the right track, but there are a few mistakes that need to be addressed.

First, when taking the log likelihood, you need to use the entire expression for p_x(k; \theta), not just the exponent. Writing k for the total number of successes in your n observations, the correct expression for \ln L(\theta) is:

\ln L(\theta) = k \ln\theta + (n-k) \ln(1-\theta)
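If you want to convince yourself numerically, here is a small Python sketch. The sample `xs` and the helper name `log_likelihood` are just illustrative, not part of the problem; it simply evaluates the expression above for a toy sample:

```python
import math

def log_likelihood(theta, xs):
    # Bernoulli log likelihood: k*ln(theta) + (n - k)*ln(1 - theta),
    # where k is the number of successes among the n observations in xs.
    n = len(xs)
    k = sum(xs)
    return k * math.log(theta) + (n - k) * math.log(1 - theta)

xs = [1, 0, 1, 1, 0]  # toy sample: k = 3 successes out of n = 5
# theta = k/n = 0.6 should give a larger value than nearby thetas:
print(log_likelihood(0.6, xs))
print(log_likelihood(0.5, xs))
print(log_likelihood(0.7, xs))
```

Evaluating at a few values of theta and seeing that 0.6 beats its neighbours is a quick sanity check on the formula.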

Also, when taking the derivative, you need to use the chain rule. So the correct derivative would be:

d/d\theta \ln L(\theta) = (k/\theta) + ((n-k)/(1-\theta))(-1)

= k/\theta - (n-k)/(1-\theta)

= (k - n\theta)/(\theta(1-\theta))

Finally, to find the MLE, you need to set this derivative equal to 0 and solve for \theta. So you would have:

(k - n\theta)/(\theta(1-\theta)) = 0

k-n\theta = 0

n\theta = k

\theta = k/n

Therefore, the MLE for this problem is \theta = k/n.
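You can also verify this numerically with a quick brute-force search over theta (the counts k = 7, n = 20 here are just an example, not from the problem):

```python
import math

# Brute-force check: maximize the Bernoulli log likelihood over a grid
# of theta values and compare the argmax with the closed-form MLE k/n.
k, n = 7, 20  # example: 7 successes in 20 trials

def log_lik(theta):
    return k * math.log(theta) + (n - k) * math.log(1 - theta)

grid = [i / 1000 for i in range(1, 1000)]  # theta in (0, 1)
theta_hat = max(grid, key=log_lik)
print(theta_hat, k / n)  # the grid argmax should sit at (about) k/n = 0.35
```

Because the log likelihood is concave in theta, the grid maximizer lands on (or right next to) the analytic answer k/n.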

I hope this helps clarify the concept of MLE for you. Just remember to use the entire expression for p_x(k; \theta) when taking the log likelihood, and to use the chain rule when taking derivatives. Keep practicing and you'll get the hang of it!
 