# Let X be a continuous random variable. What value of b minimizes E(|X-b|)?

1. Sep 4, 2011

### johnG2011

Let X be a continuous random variable. What value of b minimizes E(|X-b|)?

1. The problem statement, all variables and given/known data

Let X be a continuous random variable. What value of b minimizes E(|X-b|)? Give the derivation.

3. The attempt at a solution

E(|X - b|)

E[e - $\bar{x}$] = E(X)

E(|E[e - $\bar{x}$] - b|)

so ?,.... 0 = E(|E[e - $\bar{x}$] - E|)

but this is a graduate course; I have a funny feeling that I am supposed to derive this using the integral form of the expected value.

2. Sep 4, 2011

### lanedance

Re: Let X be a continuous random variable. What value of b minimizes E(|X-b|)?

what is e? It's not clear what your steps are attempting

I think the integral would be a good way to approach this

3. Sep 4, 2011

### johnG2011

Re: Let X be a continuous random variable. What value of b minimizes E(|X-b|)?

The e is supposed to be an observation in the sample set

4. Sep 4, 2011

### lanedance

Re: Let X be a continuous random variable. What value of b minimizes E(|X-b|)?

ok well it's still not really clear what you're trying to do

i would try writing the expectation in integral form and consider differentiating, though you may need to be careful with the absolute value:
$$f(b) = E[|X-b|] = \int_{-\infty}^{\infty} |x-b|\,p(x)\,dx$$

Last edited: Sep 5, 2011
5. Sep 4, 2011

### lanedance

Re: Let X be a continuous random variable. What value of b minimizes E(|X-b|)?

if the absolute value sign gives you trouble, you could use b to break the integral into a sum of two integrals (one over x < b, one over x > b); this, however, complicates the differentiation, since b now also appears in the integration limits
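Carrying out that split (a sketch, assuming X has density p with CDF F, and enough regularity to differentiate under the integral sign):

$$f(b) = \int_{-\infty}^{b} (b - x)\,p(x)\,dx + \int_{b}^{\infty} (x - b)\,p(x)\,dx$$

The boundary terms from Leibniz's rule vanish because the integrand is zero at x = b, so

$$f'(b) = \int_{-\infty}^{b} p(x)\,dx - \int_{b}^{\infty} p(x)\,dx = F(b) - \bigl(1 - F(b)\bigr) = 2F(b) - 1$$

Setting f'(b) = 0 gives F(b) = 1/2, i.e. b is a median of X; and since f''(b) = 2p(b) ≥ 0, this is indeed a minimum.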

Last edited: Sep 5, 2011
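For what it's worth, the conclusion this thread points to (the minimizer of E(|X-b|) is a median of X) can be sanity-checked numerically. A minimal sketch, assuming an Exp(1) distribution, whose median is ln 2; the sample size and grid range here are arbitrary choices, not anything from the thread:

```python
import math
import random

# Hypothetical check: for X ~ Exp(rate=1), the median is ln 2, so the
# grid argmin of the Monte Carlo estimate of E|X - b| should land near it.
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(10_000)]

def mean_abs_dev(b):
    """Monte Carlo estimate of E|X - b| from the fixed sample."""
    return sum(abs(x - b) for x in samples) / len(samples)

# Coarse grid search for the minimizing b over [0, 2], step 0.01.
grid = [i * 0.01 for i in range(201)]
best_b = min(grid, key=mean_abs_dev)

print(f"ln 2 = {math.log(2):.3f}, grid argmin of E|X-b| = {best_b:.3f}")
```

The argmin tracks the sample median, which converges to ln 2 ≈ 0.693 as the sample grows.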