MHB Maximum Likelihood Estimators for Uniform Distribution

SUMMARY

The discussion focuses on finding the maximum likelihood estimator (MLE) for a uniform distribution $X \sim U(0, \theta]$. The likelihood function is $L(\theta) = \frac{1}{\theta^n}$, valid only when every observation is at most $\theta$, and its logarithmic form is $\ln(L(\theta)) = -n \ln(\theta)$. The key conclusion is that $\theta$ must be at least the maximum value observed in the sample, since any sample value exceeding $\theta$ makes the likelihood zero; because $\frac{1}{\theta^n}$ decreases as $\theta$ grows, the MLE is the sample maximum.
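Written out with the support constraint made explicit (a compact restatement of the derivation worked through in the thread below):
$$\mathcal L(\theta; x_1,\dots,x_n)=\begin{cases}\dfrac{1}{\theta^n}&\text{if }\theta\ge\max_i x_i\\ 0&\text{otherwise}\end{cases}\qquad\Longrightarrow\qquad\hat\theta_{\text{MLE}}=\max_i x_i.$$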

PREREQUISITES
  • Understanding of maximum likelihood estimation (MLE)
  • Familiarity with uniform distributions, specifically $U(0, \theta]$
  • Knowledge of probability density functions (PDFs)
  • Basic calculus, particularly differentiation and logarithmic functions
NEXT STEPS
  • Study the properties of uniform distributions and their applications in statistics
  • Learn about the method of moments as an alternative to maximum likelihood estimation
  • Explore the implications of piecewise functions in statistical modeling
  • Investigate the behavior of likelihood functions in boundary conditions
USEFUL FOR

Statisticians, data scientists, and students studying statistical estimation methods, particularly those interested in uniform distributions and maximum likelihood estimation techniques.

Julio1
Find the maximum likelihood estimator for a sample of size $n$ if $X\sim U(0,\theta].$

Hello MHB :)! Can anyone help me please :)! I don't know how to proceed...
 
Julio said:
Find the maximum likelihood estimator for a sample of size $n$ if $X\sim U(0,\theta].$

Hello MHB :)! Can anyone help me please :)! I don't know how to proceed...

Hi Julio!

We want to maximize the likelihood that some $\theta$ is the right one.

The likelihood that a certain $\theta$ is the right one given a random sample is:
$$\mathcal L(\theta; x_1, ..., x_n) = f(x_1|\theta) \times f(x_2|\theta) \times ... \times f(x_n|\theta)$$
where $f$ is the probability density function.

Since $X\sim U(0,\theta]$, $f$ is given by:
$$f(x|\theta)=\begin{cases}\frac 1 \theta&\text{if }0 < x \le \theta \\ 0 &\text{otherwise}\end{cases}$$

Can you tell for which $\theta$ the likelihood will be at its maximum?
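For concreteness, here is a minimal numerical sketch (in Python, with a small hypothetical sample) of how this product behaves for a few candidate values of $\theta$:

```python
import numpy as np

def likelihood(theta, sample):
    # Product of U(0, theta] densities: 1/theta^n if all x_i <= theta, else 0.
    sample = np.asarray(sample)
    if theta <= 0 or np.any(sample > theta):
        return 0.0
    return theta ** (-len(sample))

x = [0.9, 2.3, 1.7, 0.4]            # hypothetical observations
for theta in [1.0, 2.3, 3.0, 5.0]:
    print(f"theta = {theta}: L = {likelihood(theta, x):.5f}")
```

Running it shows the likelihood is zero whenever $\theta$ is smaller than the largest observation, and shrinks as $\theta$ grows beyond it.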
 
Thanks, I like Serena :).

Good, we have that the likelihood function is $L(\theta)=\dfrac{1}{\theta^n}.$ Then, applying the logarithm, we have that

$\ln (L(\theta))=\ln\left(\dfrac{1}{\theta^n}\right)=-n\ln(\theta).$ Now, differentiating with respect to $\theta$, we have that $\dfrac{\partial}{\partial \theta}(\ln L(\theta))=\dfrac{\partial}{\partial \theta}(-n\ln(\theta))=-\dfrac{n}{\theta}.$ Thus, setting this equal to zero gives $-\dfrac{n}{\theta}=0$, i.e., $n=0$.

But why? :( The parameter $\theta$ drops out... Then I can't find an estimator for $\theta$?
 
You're welcome Julio!

What's missing in your approach is that it doesn't take into account that the density is piecewise.
So we need to inspect what happens at the boundaries.

Note that any $x_i$ that is in the sample has to be $\le \theta$, because otherwise its probability is $0$.
So $\theta$ has to be at least the maximum value that is in the sample.
What happens to the likelihood if $\theta$ is bigger than that maximum value?
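One way to see it is to evaluate the likelihood over a grid of candidate $\theta$ values for a simulated sample; a minimal sketch, assuming NumPy and an arbitrary true $\theta = 4$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 4.0, size=20)        # simulated sample from Uniform(0, 4)

# L(theta) = theta^(-n) when theta >= max(x_i), and 0 otherwise
thetas = np.linspace(0.01, 8.0, 2000)
L = np.where(thetas >= x.max(), thetas ** (-len(x)), 0.0)

print("sample maximum  :", x.max())
print("grid argmax of L:", thetas[np.argmax(L)])
```

Up to grid resolution, the maximizer coincides with the sample maximum: below it the likelihood is zero, and beyond it $1/\theta^n$ only decreases.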
 
