How Do You Calculate Maximum Likelihood Estimates for Different Distributions?

SUMMARY

The discussion focuses on calculating Maximum Likelihood Estimates (MLE) for two distributions: the negative binomial (geometric) distribution and the normal distribution. The probability mass function for the negative binomial experiment is f(x) = (1-p)^(x-1) p, x = 1, 2, 3, ..., where p is the probability that event A occurs in a single trial and x is the number of trials until A first occurs. The maximum likelihood estimate of p works out to 1/(sample mean). Additionally, the discussion addresses finding the MLE for the parameter mu of the normal distribution with a known variance sigma^2.
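The estimates quoted above follow from maximizing the log-likelihood. A brief sketch of the standard calculation (editorial addition, not taken verbatim from the thread):

```latex
% Geometric case, f(x) = (1-p)^{x-1} p:
\begin{align*}
\ell(p) &= \ln \prod_{i=1}^{n} (1-p)^{x_i - 1} p
         = n \ln p + \Big(\sum_{i=1}^{n} x_i - n\Big) \ln(1-p),\\
\ell'(p) &= \frac{n}{p} - \frac{\sum_i x_i - n}{1-p} = 0
  \quad\Rightarrow\quad \hat{p} = \frac{n}{\sum_i x_i} = \frac{1}{\bar{x}}.
\end{align*}
% Normal case with known variance \sigma_0^2:
\begin{align*}
\ell(\mu) &= -\frac{1}{2\sigma_0^2} \sum_{i=1}^{n} (x_i - \mu)^2 + \text{const}
  \quad\Rightarrow\quad \hat{\mu} = \bar{x}.
\end{align*}
```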

PREREQUISITES
  • Understanding of probability mass functions, particularly for the negative binomial distribution.
  • Familiarity with Maximum Likelihood Estimation (MLE) concepts and calculations.
  • Knowledge of normal distribution properties, including variance and mean.
  • Basic statistical concepts such as independent trials and sample means.
NEXT STEPS
  • Study the derivation of the negative binomial distribution and its applications.
  • Learn about Maximum Likelihood Estimation techniques for various statistical distributions.
  • Explore the properties and applications of the normal distribution with known variance.
  • Investigate the relationship between MLE and other estimation methods, such as minimum variance unbiased estimators.
USEFUL FOR

Statisticians, data analysts, and students studying statistical inference who are interested in understanding Maximum Likelihood Estimation for different probability distributions.

probabilityst
I actually have two questions, both of which are on the same topic

Homework Statement


Consider X = the number of independent trials until an event A occurs. Show that X has the probability mass function f(x) = (1-p)^(x-1) p, x = 1, 2, 3, ..., where p is the probability that A occurs in a single trial. Also show that the maximum likelihood estimate ^p of p is 1/(sample mean), where (sample mean) = (sum from i = 1 to n of x_i)/n. This experiment is referred to as the negative binomial experiment.

2. Find the maximum likelihood estimate for the parameter mu of the normal distribution with known variance sigma^2 = sigma_0^2 (the 0 is a subscript).

Homework Equations


The Attempt at a Solution


Um, I'm not actually sure how to go about solving this problem. Is the maximum likelihood estimate similar to the minimum variance unbiased estimator?

Any help is greatly appreciated, thank you
 
No. The "maximum likelihood estimator" of a parameter is the value of the parameter that maximizes the probability (the likelihood) of the sample actually observed.
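That definition can be checked numerically. A minimal sketch (editorial addition, not from the thread): simulate geometric data with f(x) = (1-p)^(x-1) p, then compare a brute-force maximization of the log-likelihood over a grid of p values with the closed-form answer 1/(sample mean).

```python
import math
import random

random.seed(0)
p_true = 0.3

# Draw geometric samples: x = number of trials until the first success,
# where each trial succeeds with probability p_true.
data = []
for _ in range(10_000):
    x = 1
    while random.random() >= p_true:
        x += 1
    data.append(x)

def log_likelihood(p, xs):
    """Log-likelihood of geometric samples xs at parameter p."""
    n = len(xs)
    return n * math.log(p) + (sum(xs) - n) * math.log(1 - p)

# Brute-force search: the value of p on the grid giving the largest
# probability for the actual sample.
grid = [i / 1000 for i in range(1, 1000)]
p_grid = max(grid, key=lambda p: log_likelihood(p, data))

# Closed-form MLE from the thread's first problem: 1 / (sample mean).
p_closed_form = 1 / (sum(data) / len(data))
```

Both estimates agree to the grid's resolution and land near p_true, illustrating that maximizing the likelihood and the 1/(sample mean) formula are the same estimator.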
 
