How Do You Calculate Maximum Likelihood Estimates for Different Distributions?

probabilityst
I actually have two questions, both on the same topic.

Homework Statement


1. Consider ##X## = the number of independent trials until an event ##A## first occurs. Show that ##X## has the probability mass function ##f(x) = (1-p)^{x-1}\,p##, ##x = 1, 2, 3, \ldots##, where ##p## is the probability that ##A## occurs in a single trial. Also show that the maximum likelihood estimate ##\hat{p}## of ##p## is ##1/\bar{x}##, where ##\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i## is the sample mean. This experiment is referred to as the negative binomial experiment (here the geometric case).

2. Find the maximum likelihood estimate for the parameter ##\mu## of the normal distribution with known variance ##\sigma^2 = \sigma_0^2##.

Homework Equations


The Attempt at a Solution


Um, I'm not actually sure how to go about solving this problem. Is the maximum likelihood estimate similar to the minimum variance unbiased estimator?

Any help is greatly appreciated, thank you
 
No. The "maximum likelihood estimator" of a parameter is the value of the parameter that maximizes the probability (or, for a continuous distribution, the probability density) of the sample actually observed.
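Concretely, for part 1: write the likelihood of an observed sample ##x_1, \ldots, x_n##, take the log, differentiate with respect to ##p##, and set the derivative to zero. A sketch, using the pmf stated above:

$$L(p) = \prod_{i=1}^{n} (1-p)^{x_i - 1}\, p = p^n (1-p)^{\sum_i x_i - n},$$
$$\ell(p) = \ln L(p) = n \ln p + \left( \sum_{i=1}^{n} x_i - n \right) \ln(1-p),$$
$$\frac{d\ell}{dp} = \frac{n}{p} - \frac{\sum_i x_i - n}{1-p} = 0 \quad \Longrightarrow \quad \hat{p} = \frac{n}{\sum_{i=1}^{n} x_i} = \frac{1}{\bar{x}}.$$

(One should also check that this critical point is a maximum, e.g. from the sign of the second derivative.) For part 2 the same recipe applies to the normal log-likelihood with known variance ##\sigma_0^2##:

$$\ell(\mu) = -\frac{n}{2} \ln\!\left(2\pi\sigma_0^2\right) - \frac{1}{2\sigma_0^2} \sum_{i=1}^{n} (x_i - \mu)^2,$$
$$\frac{d\ell}{d\mu} = \frac{1}{\sigma_0^2} \sum_{i=1}^{n} (x_i - \mu) = 0 \quad \Longrightarrow \quad \hat{\mu} = \bar{x}.$$

Note that only the sum-of-squares term depends on ##\mu##, so maximizing the likelihood is the same as minimizing ##\sum_i (x_i - \mu)^2##.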
 
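If you want to convince yourself numerically that ##\hat{p} = 1/\bar{x}## in part 1, here is a small Python sketch; the simulation setup and the names (p_true, neg_log_lik) are my own choices, not part of the problem:

Python:
import numpy as np
from scipy.optimize import minimize_scalar

# Simulate geometric data: number of trials until the first success.
# numpy's geometric sampler has support {1, 2, 3, ...}, matching the pmf above.
rng = np.random.default_rng(0)
p_true = 0.3
x = rng.geometric(p_true, size=1000)

# Closed-form MLE from the derivation: p_hat = 1 / sample mean.
p_hat_closed = 1.0 / x.mean()

# Numerical check: maximize the log-likelihood sum((x_i - 1) ln(1-p) + ln p) directly.
def neg_log_lik(p):
    return -np.sum((x - 1) * np.log(1.0 - p) + np.log(p))

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 1 - 1e-6), method="bounded")

print(p_hat_closed, res.x)  # both should be close to p_true = 0.3

The two estimates should agree to several decimal places, which is a useful check on the algebra.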
