Maximum likelihood of Poisson distribution

In summary, the MLE of lambda is the sample mean: the sum of the Xi divided by the number of observations n. Its expected value is lambda, its variance is lambda/n, which goes to zero as n goes to infinity, and so the MLE is consistent.
  • #1
dim&dimmer

Homework Statement


Suppose X has a Poisson distribution with parameter lambda. Given a random sample of n observations,
Find the MLE of lambda, denoted hat lambda.
Find the expected value and variance of hat lambda.
Show that hat lambda is a consistent estimator of lambda.

Homework Equations


P_X(x) = e^(-lambda) lambda^x / x!
E(X) = lambda ? or the mean
Var(X) = lambda ?? or the mean

The Attempt at a Solution


I am really struggling with stats, hope someone can help.
I try to find likelihood function
L(x1, x2, ..., xn | lambda) = e^(-lambda) lambda^x / x!
= product from i=1 to n of e^(-lambda) lambda^(xi) / xi!
then I'm stuck.
Differentiating with respect to lambda,
-e^(-lambda) x lambda^(x-1) / x!
is that the answer?
sorry about the notation also, is there somewhere on this site to get the symbols?

Any help greatly appreciated.
 
  • #2
OK, I think I've got the first part, but if my work could be checked it would be great (at least I'll get a reply for that, if not for anything else).
for the sake of notation I'll use $ for lambda

L($|x1, x2, ..., xn) = e^-$.$^x1/x1! ... e^-$.$^xn/xn!
= e^(-n$).$^(sigma xi)/(x1!...xn!)

log L = -n$ + (ln$)sigma xi -ln(product xi!)

(d log L($))/d$ = -n + sigma xi/$ =0 gives
hat$ = sigma xi/n , the MLE

For part b, Poisson distributions have lambda = mean = variance, so the mean and variance equal the result above.
Part c: the sample mean is a consistent estimator of lambda when the Xi are Poisson distributed, and the sample mean equals the MLE, therefore the MLE is a consistent estimator.

Corrections are most welcome.
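The algebra above can be sanity-checked with a minimal Python sketch (the counts here are made up; any small dataset works): the MLE of lambda is just the sample mean.

```python
# Minimal sketch (made-up counts): the MLE of lambda for Poisson data
# is the sample mean, hat_lambda = sum(x_i) / n.
data = [2, 0, 3, 1, 4, 2, 1, 3]

n = len(data)
mle_lambda = sum(data) / n

print(mle_lambda)  # 2.0
```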
 
  • #3
You are correct in saying

[tex]
\hat \lambda = \frac 1 n \sum_{i=1}^n x_i
[/tex]

is the MLE of [tex] \lambda [/tex]. To obtain the mean and variance of [tex] \hat \lambda [/tex], use the general properties (valid since the [tex] x_i [/tex] are independent) that

[tex]
E \hat \lambda = \frac 1 n \sum_{i=1}^n Ex_i
[/tex]

and

[tex]
\text{Var} \left( \hat \lambda \right) = \frac 1 {n^2} \sum_{i=1}^n \text{Var}(x_i)
[/tex]

Simplifying the variance of [tex] \hat \lambda [/tex] should also help you show that it is a consistent estimator of [tex] \lambda [/tex].
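The two properties above can also be checked by simulation. Here is a hedged, stdlib-only sketch (the Poisson sampler uses Knuth's multiplication method; the values lam = 3.0, n = 50, and 10000 trials are arbitrary choices for illustration):

```python
import math
import random

# Simulation sketch (assumed values: lam = 3.0, n = 50, 10000 trials):
# draw many samples of size n from Poisson(lam) and check that the MLE
# (the sample mean) has mean close to lam and variance close to lam / n.
random.seed(0)

def poisson_draw(lam):
    # Knuth's multiplication method for one Poisson(lam) draw.
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

lam, n, trials = 3.0, 50, 10000
mles = [sum(poisson_draw(lam) for _ in range(n)) / n for _ in range(trials)]

mean_mle = sum(mles) / trials
var_mle = sum((m - mean_mle) ** 2 for m in mles) / trials

print(mean_mle)  # close to lam = 3.0
print(var_mle)   # close to lam / n = 0.06
```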
 
  • #4
Thank you for your reply, but I'm still stuck. By using general properties do you mean that
E(xi) = sum from i=1 to n of (xi e^-lambda lambda^xi)/xi!, that is, E(X) = sum of x pX(x) for a discrete rv.
If so, I just seem to go round in circles
 
  • #5
No; rather, these two properties.
First,

[tex]
\text{E} \left(\sum_{i=1}^n \frac{X_i} n \right) = \sum_{i=1}^n \frac{\text{E}X_i}{n}
[/tex]

and

[tex]
\text{Var} \left(\sum_{i=1}^n \frac{X_i} n\right) = \sum_{i=1}^n \frac{\text{Var}(X_i)}{n^2}
[/tex]

You already know the mean and variance of a Poisson random variable; calculating the mean and variance of the MLE for [tex] \lambda [/tex] will allow you to conclude that the variance goes to zero, so ...
 
  • #6
Please excuse my ignorance, but is E(Xi) = lambda or lambda(i)?
If the latter, then the sample mean is the sum from 1 to n of lambda(i)/n, and the variance has n^2 as its denominator.
Variance approaches 0 as n approaches infinity, and 0 variance means that the estimator is consistent.
Have I got it?
 
  • #7
Since all of the [tex] X_i [/tex] values come from the same distribution, they all have the same mean ([tex] \lambda [/tex]) and they all have the same variance. You should be able to show these points ([tex] \hat \lambda [/tex] is the MLE)

1. [tex] \text{E}\left(\hat \lambda\right) = \lambda [/tex]

2. [tex] \text{Var}\left(\hat \lambda\right) [/tex] converges to zero as [tex] n \to \infty [/tex]

With these things done you've shown (by an appeal to the result you seem to mention in your most recent post) that the MLE is consistent.
 
  • #8
You are patient!
So summing E(Xi) gives n lambda, as each E(Xi) = lambda; then dividing by n gives E(hat lambda) = lambda.
Similarly, Var(hat lambda) = lambda/n, which converges to 0.
Please tell me I'm done.
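The closing step, Var(hat lambda) = lambda/n shrinking to 0, can be tabulated directly (lam = 5.0 is an arbitrary choice for illustration):

```python
# Theoretical variance of hat_lambda for an assumed lam = 5.0:
# Var(hat_lambda) = lam / n shrinks toward 0 as n grows, which is the
# heart of the consistency argument.
lam = 5.0
variances = [lam / n for n in (10, 100, 1000, 10000)]
print(variances)  # [0.5, 0.05, 0.005, 0.0005]
```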
 
  • #9
Okay: You are done. :-)
Write it up neatly.
 
  • #10
Thank you very much for your help
 
  • #11
Thanks for your help and patience with this, statdad! I googled a similar question and found this thread extremely helpful.
 

1. What is the maximum likelihood of a Poisson distribution?

The maximum likelihood estimate for a Poisson distribution is the value of lambda that maximizes the likelihood function, i.e. the probability of obtaining the observed data. It is used to estimate the parameter lambda of a Poisson distribution.

2. How is the maximum likelihood of a Poisson distribution calculated?

The maximum likelihood estimate for a Poisson distribution is calculated by taking the derivative of the log-likelihood function with respect to lambda and setting it equal to 0. Solving for lambda gives the maximum likelihood estimate, which for a sample of n observations is the sample mean.
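That calculation can be verified numerically. A hedged sketch (with made-up data): maximize the Poisson log-likelihood over a grid of lambda values and recover the sample mean.

```python
import math

# Numerical check (made-up data): the Poisson log-likelihood
#   log L(lam) = -n*lam + (sum x_i) * log(lam) - sum(log(x_i!))
# is maximized at lam = sample mean, matching the calculus above.
data = [4, 2, 6, 3, 5, 4]
n, s = len(data), sum(data)

def log_lik(lam):
    const = sum(math.lgamma(x + 1) for x in data)  # log(x!) terms
    return -n * lam + s * math.log(lam) - const

# Brute-force grid search over lam in (0, 10].
grid = [i / 1000 for i in range(1, 10001)]
best = max(grid, key=log_lik)

print(best)   # 4.0, the grid point maximizing log L
print(s / n)  # the sample mean, also 4.0
```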

3. What is the relationship between the maximum likelihood and the Poisson distribution?

Maximum likelihood is a method for estimating the parameter lambda of a Poisson distribution. The estimate is the value of lambda under which the observed data are most likely, that is, the value maximizing the likelihood function.

4. What are the assumptions made when using maximum likelihood for a Poisson distribution?

The assumptions made when using maximum likelihood for a Poisson distribution are that the data is independent and identically distributed, and that the Poisson distribution is an appropriate model for the data.

5. How is the maximum likelihood used in real-world applications?

The maximum likelihood is commonly used in real-world applications to estimate the parameters of a Poisson distribution, such as in analyzing count data in fields like biology, finance, and epidemiology. It is also used in machine learning and data analysis to fit models to observed data.
