# Maximum likelihood of Poisson distribution

1. Aug 20, 2008

### dim&dimmer

1. The problem statement, all variables and given/known data
Suppose $$X$$ has a Poisson distribution with parameter $$\lambda$$. Given a random sample of $$n$$ observations:
(a) Find the MLE of $$\lambda$$, denoted $$\hat\lambda$$.
(b) Find the expected value and variance of $$\hat\lambda$$.
(c) Show that $$\hat\lambda$$ is a consistent estimator of $$\lambda$$.

2. Relevant equations
$$P_X(x) = \frac{e^{-\lambda}\lambda^x}{x!}$$
$$E(X) = \lambda$$
$$\text{Var}(X) = \lambda$$
(For a Poisson distribution, the mean and the variance are both equal to $$\lambda$$.)

3. The attempt at a solution
I am really struggling with stats; I hope someone can help.
I try to find the likelihood function:
$$L(\lambda \mid x_1, x_2, \dots, x_n) = \prod_{i=1}^n \frac{e^{-\lambda}\lambda^{x_i}}{x_i!}$$
then I'm stuck.
Differentiating a single term with respect to $$\lambda$$,
$$\frac{d}{d\lambda}\left[\frac{e^{-\lambda}\lambda^{x}}{x!}\right] = \frac{e^{-\lambda}\lambda^{x-1}(x-\lambda)}{x!}$$
Sorry about the notation; is there somewhere on this site to get the symbols?

Any help greatly appreciated.

2. Aug 20, 2008

### dim&dimmer

OK, I think I've got the first part, but if my work could be checked it would be great (at least I'll get a reply for that if not for anything else).

$$L(\lambda \mid x_1, x_2, \dots, x_n) = \frac{e^{-\lambda}\lambda^{x_1}}{x_1!} \cdots \frac{e^{-\lambda}\lambda^{x_n}}{x_n!} = \frac{e^{-n\lambda}\lambda^{\sum x_i}}{x_1! \cdots x_n!}$$

$$\log L = -n\lambda + (\ln \lambda)\sum x_i - \ln\left(\prod x_i!\right)$$

$$\frac{d \log L(\lambda)}{d\lambda} = -n + \frac{\sum x_i}{\lambda} = 0 \quad\Rightarrow\quad \hat\lambda = \frac{\sum x_i}{n},$$

which is the MLE.
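As a sanity check on that result, here is a short Python sketch (my own illustration, not part of the coursework; the sample counts are made up) that evaluates the Poisson log-likelihood over a grid of $$\lambda$$ values and confirms the peak sits at the sample mean:

```python
import math

sample = [3, 1, 4, 1, 5, 2, 0, 3]  # arbitrary made-up Poisson counts
n = len(sample)

def log_likelihood(lam):
    # log L(lam) = sum_i [ -lam + x_i * log(lam) - log(x_i!) ]
    return sum(-lam + x * math.log(lam) - math.lgamma(x + 1) for x in sample)

lam_hat = sum(sample) / n                # claimed MLE: the sample mean (2.375 here)
grid = [0.1 * k for k in range(1, 101)]  # lambda values 0.1, 0.2, ..., 10.0
best = max(grid, key=log_likelihood)
print(lam_hat, best)                     # best grid point lands next to lam_hat
```

Because the log-likelihood is strictly concave in $$\lambda$$, the grid maximizer can differ from the sample mean by at most the grid spacing.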

For part b, a Poisson distribution has mean = variance = $$\lambda$$, so the mean and variance equal the result above.
Part c: the sample mean is a consistent estimator of $$\lambda$$ when the $$X_i$$ are Poisson distributed, and the sample mean is equal to the MLE, so the MLE is a consistent estimator.

Corrections are most welcome.

3. Aug 20, 2008

You are correct in saying

$$\hat \lambda = \frac 1 n \sum_{i=1}^n x_i$$

is the MLE of $$\lambda$$. To obtain the mean and variance of $$\hat \lambda$$, use the general properties that (since the $$x_i$$ are independent)

$$E \hat \lambda = \frac 1 n \sum_{i=1}^n Ex_i$$

and

$$\text{Var} \left( \hat \lambda \right) = \frac 1 {n^2} \sum_{i=1}^n \text{Var}(x_i)$$

Simplifying the variance of $$\hat \lambda$$ should also help you show that it is a consistent estimator of $$\lambda$$.

4. Aug 21, 2008

### dim&dimmer

Thank you for your reply, but I'm still stuck. By general properties, do you mean that
$$E(X_i) = \sum_{x=0}^{\infty} x \, \frac{e^{-\lambda}\lambda^{x}}{x!},$$
that is, $$E(X) = \sum_x x \, p_X(x)$$ for a discrete random variable?
If so, I just seem to go round in circles.

5. Aug 21, 2008

No, rather these two notions.
First,

$$\text{E} \left(\sum_{i=1}^n \frac{X_i} n \right) = \sum_{i=1}^n \frac{\text{E}X_i}{n}$$

and

$$\text{Var} \left(\sum_{i=1}^n \frac{X_i} n\right) = \sum_{i=1}^n \frac{\text{Var}(X_i)}{n^2}$$

You already know the mean and variance of a Poisson random variable; calculating the mean and variance of the MLE for $$\lambda$$ will allow you to conclude that the variance goes to zero, so ...
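The two properties above can be seen in action with a Monte Carlo sketch (my own illustration, not from the thread; the sample sizes and $$\lambda$$ are made up, and the Poisson sampler uses Knuth's multiply-uniforms method) that draws many samples, computes $$\hat\lambda$$ for each, and compares the empirical mean and variance of $$\hat\lambda$$ against $$\lambda$$ and $$\lambda/n$$:

```python
import math
import random

random.seed(0)

def poisson(lam):
    # Knuth's method: multiply uniforms until the product drops below e^-lam
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

lam, n, trials = 2.0, 50, 5000
# one lambda-hat (sample mean) per simulated sample of size n
hats = [sum(poisson(lam) for _ in range(n)) / n for _ in range(trials)]

mean_hat = sum(hats) / trials
var_hat = sum((h - mean_hat) ** 2 for h in hats) / trials
print(mean_hat, var_hat)  # close to lam = 2.0 and lam/n = 0.04
```

The empirical variance of the estimator comes out near $$\lambda/n$$ rather than $$\lambda$$, which is exactly the point of the hint.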

6. Aug 21, 2008

### dim&dimmer

Please excuse my ignorance, but is $$E(X_i) = \lambda$$ or $$\lambda_i$$?
If the latter, then the sample mean is $$\sum_{i=1}^n \lambda_i / n$$, and the variance has $$n^2$$ as its denominator.
The variance approaches 0 as $$n$$ approaches infinity, and a vanishing variance (for an unbiased estimator) means the estimator is consistent.
Have I got it?

7. Aug 21, 2008

Since all of the $$X_i$$ values come from the same distribution, they all have the same mean ($$\lambda$$) and they all have the same variance. You should be able to show these two points (where $$\hat \lambda$$ is the MLE):

1. $$\text{E}\left(\hat \lambda\right) = \lambda$$

2. $$\text{Var}\left(\hat \lambda\right)$$ converges to zero as $$n \to \infty$$

With these things done you've shown (by an appeal to the result you seem to mention in your most recent post) that the MLE is consistent.

8. Aug 21, 2008

### dim&dimmer

You are patient!
So summing the $$E(X_i)$$ gives $$n\lambda$$, since each $$E(X_i) = \lambda$$; dividing by $$n$$ then gives $$E(\hat\lambda) = \lambda$$.
Similarly, $$\text{Var}(\hat\lambda) = \lambda/n$$, which converges to 0.
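The consistency conclusion can also be illustrated numerically: since $$\text{Var}(\hat\lambda) = \lambda/n$$, Chebyshev's inequality gives $$P(|\hat\lambda - \lambda| > \epsilon) \le \lambda/(n\epsilon^2) \to 0$$. A simulation sketch (my own illustration; $$\lambda$$, $$\epsilon$$, and the sample sizes are made up, and the Poisson sampler is an ad-hoc Knuth implementation) showing the miss fraction shrink as $$n$$ grows:

```python
import math
import random

random.seed(1)

def poisson(lam):
    # Knuth's method: multiply uniforms until the product drops below e^-lam
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

lam, eps, trials = 2.0, 0.3, 1000
misses = []  # fraction of samples with |lambda_hat - lambda| > eps, per n
for n in (10, 100, 1000):
    bad = sum(
        abs(sum(poisson(lam) for _ in range(n)) / n - lam) > eps
        for _ in range(trials)
    )
    misses.append(bad / trials)
print(misses)  # the fractions shrink toward zero as n grows
```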

9. Aug 21, 2008

Okay: You are done. :-)
Write it up neatly.

10. Aug 21, 2008

### dim&dimmer

Thank you very much for your help
