# Estimation, bias and mean squared error

1. Jan 13, 2012

### stukbv

1. The problem statement, all variables and given/known data

(x1, x2, ..., xn) are modelled as observed values of independent random variables X1, X2, ..., Xn, each with density 1/θ for x in [0, θ] and 0 otherwise.
A proposed estimate of θ is m = max(x1, ..., xn). Calculate the distribution of the random variable M = max(X1, X2, ..., Xn) and, considering M as an estimator for θ, its bias and mean squared error.

2. The attempt at a solution
P(max(X1, ..., Xn) ≤ m) = P(X1 ≤ m)P(X2 ≤ m)···P(Xn ≤ m)
by independence of the Xi's.

Then since they all have the same distribution this is just
(m/θ)^n

So to get the distribution, do I just differentiate with respect to θ?
That would give me

n(m/θ)^(n-1) * (-m/θ^2)

Is this the right way to think about it ?

Thank you

Last edited: Jan 13, 2012
2. Jan 13, 2012

### Ray Vickson

You have the (cumulative) distribution function F(m) = Pr{M <= m}. How do you get the density function of M from that? There is a standard formula; you just need to use it.

RGV

3. Jan 14, 2012

### stukbv

I know that you differentiate to get the density function, but I can't work out whether it's with respect to θ (which is what I did above) or with respect to m.

4. Jan 14, 2012

### I like Serena

$f(m) = {d \over dm} F(m)$.
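Carrying this formula through to the quantities the problem asks for (a sketch filling in the steps, not part of the original posts): differentiating the CDF with respect to m, then integrating against the density, gives

$$
\begin{aligned}
f_M(m) &= \frac{d}{dm}\left(\frac{m}{\theta}\right)^n = \frac{n\,m^{n-1}}{\theta^n}, \qquad 0 \le m \le \theta,\\[4pt]
\mathbb{E}[M] &= \int_0^\theta m \cdot \frac{n\,m^{n-1}}{\theta^n}\,dm = \frac{n\theta}{n+1},
\qquad \text{bias}(M) = \mathbb{E}[M] - \theta = -\frac{\theta}{n+1},\\[4pt]
\mathbb{E}[M^2] &= \int_0^\theta m^2 \cdot \frac{n\,m^{n-1}}{\theta^n}\,dm = \frac{n\theta^2}{n+2},\\[4pt]
\text{MSE}(M) &= \mathbb{E}[(M-\theta)^2] = \mathbb{E}[M^2] - 2\theta\,\mathbb{E}[M] + \theta^2
= \frac{2\theta^2}{(n+1)(n+2)}.
\end{aligned}
$$

So M slightly underestimates θ, and both the bias and the MSE shrink to 0 as n grows.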

5. Jan 14, 2012

### Ray Vickson

The standard formula would tell you exactly what to do---no confusion!

RGV

6. Jan 14, 2012

### stukbv

I see, thank you!
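As a quick sanity check of the thread's conclusion (not from the original posts), a Monte Carlo simulation can compare the empirical bias and MSE of M = max(X1, ..., Xn) against the closed-form values bias = -θ/(n+1) and MSE = 2θ²/((n+1)(n+2)); the function name and parameters below are illustrative:

```python
import random

def simulate_max_estimator(theta, n, trials, seed=0):
    """Monte Carlo estimate of the bias and MSE of M = max(X1..Xn)
    for X_i ~ Uniform(0, theta)."""
    rng = random.Random(seed)
    # Error of the estimator on each trial: M - theta
    errs = [max(rng.uniform(0, theta) for _ in range(n)) - theta
            for _ in range(trials)]
    bias = sum(errs) / trials
    mse = sum(e * e for e in errs) / trials
    return bias, mse

theta, n = 2.0, 5
bias, mse = simulate_max_estimator(theta, n, trials=200_000)
# Theory: bias = -theta/(n+1), MSE = 2*theta^2/((n+1)*(n+2))
print(bias, -theta / (n + 1))                    # both close to -1/3
print(mse, 2 * theta**2 / ((n + 1) * (n + 2)))   # both close to 8/42
```

With 200,000 trials the empirical values typically agree with the formulas to two decimal places.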