Expectation of a Uniform distribution maximum likelihood estimator

SUMMARY

The maximum likelihood estimator (MLE) for a Uniform distribution U(0, k) is the maximum value observed in the sample. The expected value of the maximum, E[max(X_1, ..., X_n)], is calculated using the cumulative distribution function (CDF) of the maximum, leading to the result E[max(X_1, ..., X_n)] = (n/(n+1)) * k. This value lies between k/2 and k, depending on the sample size. The discussion also touches on finding the bias of the Jackknife estimator for the uniform distribution and the expectation of the second largest observation.

PREREQUISITES
  • Understanding of Uniform distribution U(0, k)
  • Knowledge of maximum likelihood estimation (MLE)
  • Familiarity with cumulative distribution functions (CDF)
  • Basic integration techniques for probability density functions
NEXT STEPS
  • Study the derivation of the expected value for the maximum of a sample from a Uniform distribution
  • Learn about the bias of the Jackknife estimator in statistical inference
  • Explore the distribution of the minimum value in a sample and its implications
  • Investigate conditional probability techniques for finding expectations of order statistics
USEFUL FOR

Statisticians, data analysts, and students studying statistical inference, particularly those focusing on maximum likelihood estimation and order statistics.

artbio
Hi, I had this question on my last "Statistical Inference" exam, and I still have some doubts about it. I determined that the maximum likelihood estimator of a Uniform distribution U(0,k) is equal to the maximum value observed in the sample. That is correct; so say my textbooks. After that, the bias of the estimator was asked for, and to determine the bias I need to determine its expectation first. I am not sure whether its expectation is equal to k or k/2.

Is there any kind soul who can help?
Thanks.
 
You can find the expectation by first writing down the density for the max, then proceeding with integration the same way you find any expectation. The answer is neither k nor k/2.
 
So:

E[\max(x_1,\dots,x_n)]=\int_0^k \max(x_1,\dots,x_n)\frac{1}{k} \, dx=\frac{1}{k}\int_0^k \max(x_1,\dots,x_n)\, dx

Is this correct?

Now I have a problem: since the "max" is itself a random variable, whose density function I don't know, how do I integrate this?
 
Think about this (some steps missing on purpose)

1) You are dealing with a random sample of size n, so the individual x_i are independent

2) I'll call the maximum of the variables M (non-standard, but it will work). The cumulative distribution function of the maximum is, by definition,

F(t) \equiv P(M \le t)

3) If the maximum value is \le t, that means all of the variables are, so

P(M \le t) = P(X_1 \le t \text{ and } X_2 \le t \text{ and } \dots \text{ and } X_n \le t)

What does independence tell you about how the statement immediately above can be simplified?

4) Once you have an expression for the distribution function, differentiate it w.r.t. t to get the density; call it f(t)

5) The expected value of the max is

\int t f(t) \, dt

and you integrate over the range appropriate for the uniform distribution. The result will be an expression involving k and n.
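The recipe above can be checked numerically without doing any algebra. The sketch below (my own illustration; the values of k and n are arbitrary) builds the CDF of the max from step 3, differentiates it numerically for step 4, and integrates t·f(t) for step 5:

```python
# Numeric sketch of steps 3)-5) above; k and n are arbitrary choices.
k, n = 2.0, 5

def F(t):
    # Step 3: CDF of the max, P(M <= t) = (t/k)**n for 0 <= t <= k
    return (t / k) ** n

def f(t, h=1e-6):
    # Step 4: density of the max as a central-difference derivative of F
    return (F(t + h) - F(t - h)) / (2 * h)

# Step 5: E[M] = integral of t * f(t) over [0, k], via a midpoint Riemann sum
steps = 100_000
dt = k / steps
expected = sum((i + 0.5) * dt * f((i + 0.5) * dt) * dt for i in range(steps))

print(expected)            # numeric estimate of E[M]
print(n / (n + 1) * k)     # closed-form value it should approach
```

With k = 2 and n = 5 both printed values agree to several decimal places, which matches the closed-form answer derived later in the thread.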
 
Thanks man. You're awesome!

Let's see if I got it.

I will use the \Theta greek letter instead of a k because that was the one used on the original question.

1) Let (X_1,X_2,\dots,X_n) be a random sample. The individual X_i are independent, identically distributed random variables that follow a Uniform distribution, X_i \sim U(0,\Theta).

2) The cumulative distribution function of the maximum is, by definition:

F(t) \equiv P(M \le t)

3) If the maximum value is \le t, that means all of the variables are, so:

P(M \le t) = P(X_1 \le t \; \cap \; X_2 \le t \; \cap \; \dots \; \cap \; X_n \le t)=\prod_{i=1}^n\!P(X_i \le t)

The product follows because the individual X_i are independent random variables.

4) The probability density of the above Uniform distribution is:

f(x)=\frac{1}{\Theta} \; , \; 0 \le x \le \Theta

5) So its distribution function is:

F(t)=0 \; , \; t \le 0
F(t)=\frac{t}{\Theta} \; , \; 0 < t \le \Theta
F(t)=1 \; , \; t > \Theta

6) So the product:

\prod_{i=1}^n\!P(X_i \le t)

equals the product of n copies of the distribution function defined in point 5), which yields:

F(t)=0 \; , \; t \le 0
F(t)=\left(\frac{t}{\Theta}\right)^n \; , \; 0 < t \le \Theta
F(t)=1 \; , \; t > \Theta

And the distribution of the maximum was found!

7) Differentiating the distribution function of the maximum gives its density function:

f(t)=\frac{nt^{n-1}}{\Theta^n} \; , \; 0 \le t \le \Theta

8) Finally the expected value of the max is:

E[\max(X_1,X_2,\dots,X_n)]=E[M]=\int_{-\infty}^{+\infty}tf(t) \, dt=\int_0^{\Theta}t\frac{nt^{n-1}}{\Theta^n} \, dt=\frac{n}{\Theta^n}\int_0^{\Theta}t^n \, dt=\frac{n}{n+1}\Theta

So the result is neither \Theta nor \Theta \over 2 but a value that depends on the sample size and lies between the two. It equals \Theta \over 2 exactly when the sample size is one, and it tends to \Theta as the sample size approaches infinity. This accords with my original intuition, so I think it is correct.
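As a quick sanity check of the derivation, a Monte Carlo simulation should reproduce E[M] = n\Theta/(n+1). A minimal sketch (the values of \Theta, n, and the trial count are arbitrary choices):

```python
import random

# Monte Carlo check of E[max] = n/(n+1) * theta; theta and n are arbitrary.
random.seed(0)
theta, n, trials = 3.0, 4, 200_000

# Average the sample maximum over many simulated samples of size n
avg_max = sum(max(random.uniform(0, theta) for _ in range(n))
              for _ in range(trials)) / trials

print(avg_max)                      # should be near n/(n+1)*theta = 2.4
```

With n = 4 and \Theta = 3 the simulated average comes out close to 2.4, which is what the formula predicts.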

What do you think?
 
Looks good. Now you can find the bias.

Note: very soon after this type of problem (needing the distribution of the maximum in a particular case) it is customary to give a problem that requires determining the distribution of the MINIMUM value, so you may want to begin thinking about how you'd find the distribution function for that.
 
Is the minimum the same, except with
\sum_{i=1}^n\!P(X_i \le t)
instead of
\prod_{i=1}^n\!P(X_i \le t)
which would make
F_{M}(t) = n\frac{t}{\theta}
... ?
 
No. If X_1 is the minimum, setting up

P(X_1 \le t)

won't help you (for any underlying distribution). Try starting with

P(X_1 > t)

and then ask: if the minimum is larger than some value, what can I conclude about
all the rest of the values (compared to that same t)?
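The hint can be verified empirically before doing the algebra: simulate the sample minimum and compare P(min > t) against the product that independence suggests. A sketch (my illustration; \theta, n, t, and the trial count are arbitrary):

```python
import random

# Empirical check of the hint: P(min > t) should equal the product
# P(X_1 > t) * ... * P(X_n > t) = (1 - t/theta)**n by independence.
random.seed(1)
theta, n, t, trials = 1.0, 3, 0.4, 200_000

# Count how often the sample minimum exceeds t
hits = sum(min(random.uniform(0, theta) for _ in range(n)) > t
           for _ in range(trials))

print(hits / trials)            # should be close to (1 - 0.4)**3 = 0.216
```

The empirical frequency lands near 0.216, supporting the "all values must exceed t" reasoning for the minimum.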
 
Aha, yes
I think I get it now :)
Thank you
 
This thread has been very helpful to me. I have a question: I was asked to find the bias of the Jackknife estimator for the uniform distribution. I managed to get the expectation. I now have to find the expectation of the second largest observation, that is, X_{(n-1)}. I proceeded in the same manner but could not get the required answer. Do I have to use conditional probability?
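For what it's worth, conditional probability shouldn't be necessary: the standard order-statistic result for U(0,\theta) is E[X_{(k)}] = k\theta/(n+1), so the second largest (k = n-1) has expectation (n-1)\theta/(n+1). A quick simulation to check that claim (\theta, n, and the trial count are arbitrary choices):

```python
import random

# Check the standard order-statistic result for Uniform(0, theta):
# E[X_(k)] = k*theta/(n+1), so the second largest observation (k = n-1)
# has expectation (n-1)*theta/(n+1). theta and n are arbitrary here.
random.seed(2)
theta, n, trials = 1.0, 5, 200_000

# Average the second-largest value over many simulated samples of size n
avg_2nd = sum(sorted(random.uniform(0, theta) for _ in range(n))[-2]
              for _ in range(trials)) / trials

print(avg_2nd)                  # should be near (n-1)/(n+1)*theta = 4/6
```

With n = 5 the simulated average sits near 2/3, consistent with the formula.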
 