Maximum Likelihood Estimators for Uniform Distribution

Discussion Overview

The discussion revolves around finding maximum likelihood estimators (MLE) for a uniform distribution defined as $X \sim U(0, \theta]$. Participants explore the likelihood function, its maximization, and the implications of sample values on the estimation of $\theta$.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant introduces the likelihood function for the uniform distribution and asks for guidance on maximizing it.
  • Another participant calculates the likelihood function as $L(\theta) = \frac{1}{\theta^n}$ and derives its logarithm, leading to a derivative that suggests $n=0$, raising confusion about the estimator for $\theta$.
  • A subsequent reply points out that the piecewise nature of the probability density function must be considered, emphasizing that $\theta$ must be at least as large as the maximum sample value.
  • This reply questions the behavior of the likelihood function when $\theta$ exceeds the maximum sample value, suggesting further investigation is needed.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the maximum likelihood estimator for $\theta$, with ongoing confusion about the implications of the likelihood function and the role of sample values.

Contextual Notes

The discussion highlights the importance of considering the boundaries of the uniform distribution and the implications of sample values on the estimation process. There are unresolved mathematical steps regarding the behavior of the likelihood function.

Julio1
Find the maximum likelihood estimator for a sample of size $n$ if $X\sim U(0,\theta].$

Hello MHB :)! Can anyone help me please :)! I don't know how to proceed...
 
Julio said:
Find the maximum likelihood estimator for a sample of size $n$ if $X\sim U(0,\theta].$

Hello MHB :)! Can anyone help me please :)! I don't know how to proceed...

Hi Julio!

We want to maximize the likelihood that some $\theta$ is the right one.

The likelihood that a certain $\theta$ is the right one given a random sample is:
$$\mathcal L(\theta; x_1, ..., x_n) = f(x_1|\theta) \times f(x_2|\theta) \times ... \times f(x_n|\theta)$$
where $f$ is the probability density function.

Since $X\sim U(0,\theta]$, $f$ is given by:
$$f(x|\theta)=\begin{cases}\frac 1 \theta&\text{if }0 < x \le \theta \\ 0 &\text{otherwise}\end{cases}$$

Can you tell for which $\theta$ the likelihood will be at its maximum?
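To build intuition, here is a minimal numerical sketch (in Python, using a made-up sample) of the likelihood above: it is zero whenever $\theta$ is below the largest observation, and equals $1/\theta^n$ otherwise.

```python
import numpy as np

def likelihood(theta, xs):
    """L(theta) = prod_i f(x_i | theta) for X ~ U(0, theta]."""
    xs = np.asarray(xs, dtype=float)
    # If any observation exceeds theta, its density (and hence the product) is 0.
    if theta <= 0 or xs.max() > theta:
        return 0.0
    return theta ** (-len(xs))

sample = [0.8, 2.3, 1.5]        # hypothetical sample; the maximum is 2.3
for theta in [2.0, 2.3, 3.0, 5.0]:
    print(f"theta = {theta}: L = {likelihood(theta, sample):.4f}")
```

Running this shows the likelihood is $0$ for $\theta = 2.0$ (below the sample maximum), jumps to its largest value exactly at $\theta = 2.3$, and then decreases as $\theta$ grows.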
 
Thanks, I like Serena :).

Good, we have that the likelihood function is $L(\theta)=\dfrac{1}{\theta^n}.$ Then, applying the logarithm, we have

$\ln (L(\theta))=\ln\left(\dfrac{1}{\theta^n}\right)=-n\ln(\theta).$ Now, differentiating with respect to $\theta$, we have $\dfrac{\partial}{\partial \theta}(\ln L(\theta))=\dfrac{\partial}{\partial \theta}(-n\ln(\theta))=-\dfrac{n}{\theta}.$ Thus, setting this equal to zero gives $-\dfrac{n}{\theta}=0$, i.e., $n=0$.

But why? :( The parameter $\theta$ drops out?... Then I can't find an estimator for $\theta$?
 
You're welcome Julio!

What's missing from your approach is that it doesn't take into account that the density function is piecewise.
So we need to inspect what happens at the boundaries.

Note that any $x_i$ in the sample has to be $\le \theta$, because otherwise its density is $0$, which makes the whole likelihood $0$.
So $\theta$ has to be at least the maximum value in the sample.
What happens to the likelihood if $\theta$ is bigger than that maximum value?
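Putting the pieces together: since $L(\theta)=\theta^{-n}$ is strictly decreasing for $\theta \ge \max_i x_i$ and zero below it, the likelihood is maximized exactly at the sample maximum. A quick simulation (a Python sketch with an assumed true $\theta = 4$) illustrates this:

```python
import random

def mle_theta(xs):
    # L(theta) = theta**(-n) is strictly decreasing for theta >= max(xs)
    # and 0 for theta < max(xs), so the likelihood peaks at max(xs).
    return max(xs)

random.seed(0)                     # for reproducibility
true_theta = 4.0                   # assumed "true" parameter
sample = [random.uniform(0, true_theta) for _ in range(1000)]

theta_hat = mle_theta(sample)
print(theta_hat)  # slightly below 4.0: the MLE never overshoots theta
```

Note that $\hat\theta = \max_i x_i$ is always at most the true $\theta$, which is why this MLE is biased (slightly) downward.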
 
