Expectation of a uniform distribution maximum likelihood estimator

Discussion Overview

The discussion revolves around the maximum likelihood estimator (MLE) for a uniform distribution U(0,k), specifically focusing on determining the expectation of the estimator and its bias. Participants explore the mathematical derivation of the expectation and the properties of the maximum of a random sample drawn from this distribution.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant asserts that the MLE for U(0,k) is the maximum value observed in the sample, which is supported by textbooks.
  • Another participant suggests that the expectation of the maximum is neither k nor k/2, prompting further exploration.
  • A participant proposes a method to find the expectation by integrating the maximum value over the uniform distribution's density function.
  • There is a discussion about the cumulative distribution function (CDF) of the maximum and how to derive the probability density function (PDF) from it.
  • One participant derives the CDF of the maximum and subsequently its PDF, leading to an expression for the expected value of the maximum, which depends on the sample size.
  • Another participant confirms the derived expectation and discusses its behavior as the sample size changes.
  • A later post introduces the concept of finding the distribution of the minimum value, leading to a question about the correct approach to derive it.
  • One participant expresses a need for clarification on finding the expectation of the second largest observation in the context of a different estimator, the Jackknife estimator.

Areas of Agreement / Disagreement

Participants generally agree on the method to derive the expectation of the maximum, but there is no consensus on the exact value of the expectation until it is fully derived. The discussion about the minimum value introduces additional uncertainty, and the question regarding the Jackknife estimator remains unresolved.

Contextual Notes

Participants note that the expectation of the maximum is influenced by the sample size and lies between k/2 and k, but the exact relationship is not fully settled. The derivation of the minimum's distribution is left open for further exploration, and the approach to finding the expectation of the second largest observation is also not resolved.

Who May Find This Useful

This discussion may be useful for students and practitioners in statistics, particularly those interested in maximum likelihood estimation, properties of uniform distributions, and advanced statistical methods like the Jackknife estimator.

artbio
Hi, I had this question on my last "Statistical Inference" exam, and I still have some doubts about it. I determined that the maximum likelihood estimator of a uniform distribution U(0,k) is equal to the maximum value observed in the sample. That is correct; so say my textbooks. After that, the bias of the estimator was asked for. To determine the bias I first need the estimator's expectation, and I am not sure whether that expectation is k or k/2.

Is there any kind soul who can help?
Thanks.
 
You can find the expectation by first writing down the density for the max, then proceeding with integration the same way you find any expectation. The answer is neither k nor k/2.
 
So:

E[\max(x_1,\dots,x_n)]=\int_0^k \max(x_1,\dots,x_n)\,\frac{1}{k} \, dx=\frac{1}{k}\int_0^k \max(x_1,\dots,x_n)\, dx

Is this correct?

Now I have a problem: the "max" is itself a random variable, and I don't know its density function. How do I integrate this?
 
Think about this (some steps missing on purpose)

1) You are dealing with a random sample of size n, so the individual x_i are independent

2) I'll call the maximum of the variables M (non-standard, but it will work). The cumulative distribution function of the maximum is, by definition,

F(t) \equiv P(M \le t)

3) If the maximum value is <= t, that means all of the variables are, so

P(M \le t) = P(X_1 \le t \text{ and } X_2 \le t \text{ and } \dots \text{ and } X_n \le t)

What does independence tell you about how the statement immediately above can be simplified?

4) Once you have an expression for the distribution function, differentiate it w.r.t. t to get the density - call it f(t)

5) The expected value of the max is

\int t f(t) \, dt

- integrate over the range suitable for the uniform distribution. The result will be an expression involving k and n.
 
Thanks man. You're awesome!

Let's see if I got it.

I will use the Greek letter \Theta instead of k because that was the one used in the original question.

1) Let (X_1,X_2,\dots,X_n) be a random sample. The individual X_i are independent, identically distributed random variables following a uniform distribution, X_i \sim U(0,\Theta).

2) The cumulative distribution function of the maximum is, by definition:

F(t) \equiv P(M \le t)

3) If the maximum value is \le t, that means all of the variables are, so:

P(M \le t) = P(X_1 \le t \; \cap \; X_2 \le t \; \cap \; \dots X_n \le t)=\prod_{i=1}^n\!P(X_i \le t)

The product follows because the individual X_i are independent random variables.

4) The probability density of the above Uniform distribution is:

f(x)=\frac{1}{\Theta} \; , \; 0 \le x \le \Theta

5) So its distribution function is:

F(t)=0 \; , \; t \le 0
F(t)=\frac{t}{\Theta} \; , \; 0 < t \le \Theta
F(t)=1 \; , \; t > \Theta

6) So the product:

\prod_{i=1}^n\!P(X_i \le t)

equals the product of n distribution functions defined in point 5) which yields:

F(t)=0 \; , \; t \le 0
F(t)=\left(\frac{t}{\Theta}\right)^n \; , \; 0 < t \le \Theta
F(t)=1 \; , \; t > \Theta

And the distribution of the maximum was found!

7) Differentiating the distribution function of the maximum gives its density function:

f(t)=\frac{nt^{n-1}}{\Theta^n} \; , \; 0 \le t \le \Theta

8) Finally the expected value of the max is:

E[\max(X_1,X_2,\dots,X_n)]=\int_{-\infty}^{+\infty}tf(t) \, dt=\int_0^{\Theta}t\,\frac{nt^{n-1}}{\Theta^n} \, dt=\frac{n}{\Theta^n}\int_0^{\Theta}t^n \, dt=\frac{n}{n+1}\Theta

So the result is neither \Theta nor \Theta/2 but a value that depends on the sample size and lies between the two. It equals \Theta/2 exactly when the sample size is one, and it tends to \Theta as the sample size approaches infinity. This agrees with my original intuition, so I think it is correct.

What do you think?
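As a sanity check on the derivation (not part of the original thread's argument), a quick Monte Carlo simulation can be run against the closed form n\Theta/(n+1); the values of theta and n below are arbitrary choices for illustration:

```python
# Monte Carlo check of E[max(X_1,...,X_n)] = n*theta/(n+1) for X_i ~ U(0, theta).
# theta = 5, n = 4 are arbitrary illustrative values.
import numpy as np

rng = np.random.default_rng(42)
theta, n = 5.0, 4
samples = rng.uniform(0.0, theta, size=(200_000, n))  # 200k samples of size n
maxima = samples.max(axis=1)                          # the MLE for each sample

est = maxima.mean()            # simulated E[max]
exact = n * theta / (n + 1)    # closed form: 4*5/5 = 4.0 here
print(est, exact)
```

With 200,000 replications the simulated mean should match the closed form to two or three decimal places.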
 
Looks good. Now you can find the bias.

Note: very soon after this type of problem (needing the distribution of the maximum in a particular case) it is customary to give a problem that requires determining the distribution of the MINIMUM value, so you may want to begin thinking about how you'd find the distribution function for that.
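The bias mentioned above follows directly from the expectation just derived, since bias = E[max] - \Theta. A symbolic check (the symbol names are mine, using sympy):

```python
# Symbolic computation of the bias of the MLE, given E[max] = n*theta/(n+1)
# as derived in the thread.  bias = E[max] - theta.
import sympy as sp

theta = sp.symbols("theta", positive=True)
n = sp.symbols("n", positive=True, integer=True)

E_max = n * theta / (n + 1)        # expectation derived above
bias = sp.simplify(E_max - theta)  # -theta/(n + 1): the MLE underestimates theta
print(bias)
```

The bias is negative and shrinks as n grows, consistent with the estimator tending to \Theta for large samples.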
 
Is the minimum the same except for

\sum_{i=1}^n\!P(X_i \le t)

instead of the

\prod_{i=1}^n\!P(X_i \le t)

which would make

F_{M}(t) = n\frac{t}{\theta}

... ?
 
No. If X_1 is the minimum, setting up

P(X_1 \le t)

won't help you (for any underlying distribution). Try starting with

P(X_1 > t)

and then ask: if the minimum is larger than some value, what can I conclude about all the rest of the values (compared to that same t)?
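The complement identity in this hint, P(min > t) = P(all X_i > t) = ((\theta - t)/\theta)^n for U(0,\theta), can be checked numerically; theta, n, and t below are arbitrary illustrative values:

```python
# Numerical check of P(min > t) = ((theta - t)/theta)**n for X_i ~ U(0, theta),
# i.e. the minimum exceeds t exactly when every X_i does.
import numpy as np

rng = np.random.default_rng(0)
theta, n, t = 2.0, 5, 0.7    # arbitrary illustrative values
samples = rng.uniform(0.0, theta, size=(200_000, n))
mins = samples.min(axis=1)

est = (mins > t).mean()                  # simulated P(min > t)
exact = ((theta - t) / theta) ** n       # product of n independent tail probs
print(est, exact)
```

The simulated frequency and the product formula should agree closely, which is the independence step the hint is pointing at.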
 
Aha, yes
I think I get it now :)
Thank you
 
This thread has been very helpful to me. I have a question: I was asked to find the bias of the Jackknife estimator for the uniform distribution, and I managed to get the expectation. I now have to find the expectation of the second largest observation, that is, X_{(n-1)}. I proceeded in the same manner but could not get the required answer. Do I have to use conditional probability?
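For the second largest observation, the standard order-statistic result for U(0,\theta) gives E[X_{(k)}] = k\theta/(n+1), so E[X_{(n-1)}] = (n-1)\theta/(n+1); conditional probability is not needed. A simulation sketch (theta and n are arbitrary illustrative values):

```python
# Monte Carlo check of E[X_(n-1)] = (n-1)*theta/(n+1) for the second largest
# of n draws from U(0, theta), using the general uniform order-statistic result.
import numpy as np

rng = np.random.default_rng(1)
theta, n = 3.0, 6            # arbitrary illustrative values
samples = rng.uniform(0.0, theta, size=(200_000, n))
second_largest = np.sort(samples, axis=1)[:, -2]   # X_(n-1) in each sample

est = second_largest.mean()
exact = (n - 1) * theta / (n + 1)    # = 15/7 here
print(est, exact)
```

Sorting each sample and taking the second-to-last column is the direct way to simulate any order statistic.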
 
