# Homework Help: Maximum Likelihood

1. Feb 5, 2016

### dspampi

1. The problem statement, all variables and given/known data
Let X1, X2,....Xn be a random sample from pdf,
$f(x|\theta) = \theta x^{-2}$, where $0 < \theta \leq x < \infty$

Find the MLE of θ

My attempt:

Likelihood function: $L(\theta|x) = \prod_{i=1}^n \theta x_i^{-2} = \theta^n \prod_{i=1}^n x_i^{-2}$

To find the MLE, I take the log of that function, take the partial derivative of $\log L(\theta|x)$ with respect to $\theta$, and set it equal to 0, which gives: $n/\theta = 0$.

However, I realize that $\theta \leq x$ and $\theta > 0$... what do I need to do to incorporate these constraints into my likelihood function?
In class we discussed Fisher information, and I have a guess that it has some involvement with this problem, but I'm not sure why, or how we could use Fisher information here.

Last edited: Feb 5, 2016
2. Feb 5, 2016

### RUber

It looks like you are on the right track.
You are looking to maximize
$L(\theta | x ) = \prod_{i=1}^n \theta x_i^{-2} = \theta^n \prod_{i=1}^n x_i^{-2}$
for a given sample $x$.
If you note that $C = \prod_{i=1}^n x_i^{-2}$ is a positive constant for any fixed sample, you are maximizing
$L(\theta|x) = C\theta^n.$
This is strictly increasing in $\theta$, so you want $\theta$ as large as possible. That is consistent with what you found: $n/\theta = 0$ has no solution, meaning the derivative never vanishes and the maximum cannot occur at an interior point.
Now apply the constraints. For a sample $x = (x_1, x_2, \ldots, x_n)$, what is the largest allowable $\theta$?
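The reasoning above can be checked numerically. A minimal sketch (all names here, such as `theta_true` and `log_likelihood`, are my own for illustration; the sample is simulated by inverse-CDF sampling, since $F(x) = 1 - \theta/x$ for this density):

```python
import numpy as np

# Simulate a hypothetical sample from f(x|theta) = theta * x**(-2), x >= theta.
# Inverse CDF: F(x) = 1 - theta/x  =>  X = theta / (1 - U), U ~ Uniform(0, 1).
rng = np.random.default_rng(0)
theta_true = 2.0
x = theta_true / (1.0 - rng.uniform(size=50))

def log_likelihood(theta, x):
    """log L(theta|x) = n*log(theta) - 2*sum(log(x_i)), valid for 0 < theta <= min(x)."""
    return len(x) * np.log(theta) - 2.0 * np.sum(np.log(x))

# Evaluate over the feasible region (0, min(x)].  Because log L is strictly
# increasing in theta, the grid maximizer sits at the right endpoint min(x).
grid = np.linspace(1e-6, x.min(), 1000)
vals = [log_likelihood(t, x) for t in grid]
theta_hat = grid[np.argmax(vals)]
print(theta_hat, x.min())  # the grid maximizer coincides with min(x)
```

The grid search is only a sanity check: it confirms that the likelihood has no interior critical point and climbs all the way to the boundary of the feasible region.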

3. Feb 5, 2016

### Ray Vickson

In constrained optimization it is often wrong to set derivatives to zero, because the results violate the constraints. In your problem, you want to solve
$$\max_{\theta} L(\theta|x) = \theta^n \prod_{i=1}^n x_i^{-2}, \; \text{subject to} \; 0 < \theta \leq \min\, (x_1,x_2, \ldots, x_n)$$
The derivative will definitely not equal $0$ at the optimal solution.

If $\underline{x} = \min\, (x_1,x_2, \ldots, x_n)$, what is the solution of the problem of maximizing $\theta^n$ over the region $0 < \theta \leq \underline{x}$?
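For reference, the boundary argument being pointed to here can be written out explicitly (a sketch of the standard monotonicity argument, not part of the original thread):

Since $\frac{d}{d\theta} \log L(\theta|x) = \frac{n}{\theta} > 0$ for every $\theta > 0$, the likelihood is strictly increasing on the feasible region, so its maximum over $0 < \theta \leq \underline{x}$ is attained at the right endpoint:
$$\hat{\theta}_{\mathrm{MLE}} = \underline{x} = \min\,(x_1, x_2, \ldots, x_n).$$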

Last edited: Feb 5, 2016