# Proving Markov's Inequality (Probability)

1. Jan 20, 2013

### TelusPig

1. The problem statement, all variables and given/known data

If $g(x)\ge 0$, then for any constants $c>0$ and $r>0$:

$P(g(X)\ge c)\le \frac{E((g(X))^r)}{c^r}$

2. Relevant equations

I know that $E(g(X))=\int_{-\infty}^\infty g(x)f(x)\,dx$ where $f(x)$ is the pdf of $X$ (and the lower limit can be taken as $0$ when $X\ge 0$).

3. The attempt at a solution

I tried following a similar proof to what I found here (on page 2: http://www.stat.cmu.edu/~larry/=stat705/Lecture2.pdf).

I'm really stuck on this. I've looked on the Internet for a proof, but the inequalities I find involve $X$, not $g(X)$, and there isn't any exponent $r$. I tried replacing $X$ with $g(X)$ in a similar argument, but it doesn't quite work out: I still need to bring in the exponent $r$, and I don't see how the last step would let me write something like $c^r P(g(X)\ge c)$.
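For reference, the standard Markov argument from the linked notes, for a nonnegative random variable $X$ with pdf $f$, can be written as one chain of inequalities:

$$E(X)=\int_0^\infty x f(x)\,dx \;\ge\; \int_c^\infty x f(x)\,dx \;\ge\; c\int_c^\infty f(x)\,dx = c\,P(X\ge c),$$

so $P(X\ge c)\le E(X)/c$. The question is how to adapt this chain when $X$ is replaced by $g(X)$ and an exponent $r$ appears.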

2. Jan 20, 2013

### jbunniii

Suppose $g(X)$ has a pdf; let's call it $f$. Then
$$P(g(X) \geq c) = \int_c^\infty f(y)\,dy.$$
In this integral we have $y \geq c$. Since $r > 0$ and $c > 0$, this implies $y^r \geq c^r$, which in turn is equivalent to $1 \leq y^r/c^r$. What's the natural next step?

For the more general case in which $g(X)$ does not necessarily admit a pdf, the proof is similar, except you would use a Stieltjes integral.
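As a numerical sanity check (not part of the proof), here is a quick Monte Carlo sketch of the inequality. The exponential distribution, the choice $g(x)=x^2$, and the values of $c$ and $r$ are arbitrary illustrative picks, not anything from the problem:

```python
import random

# Monte Carlo check of P(g(X) >= c) <= E(g(X)^r) / c^r.
# X ~ Exponential(1), g(x) = x^2, c = 2, r = 1.5 are arbitrary choices.
random.seed(0)

def g(x):
    return x * x  # any nonnegative function works here

n = 200_000
samples = [g(random.expovariate(1.0)) for _ in range(n)]

c, r = 2.0, 1.5
lhs = sum(1 for y in samples if y >= c) / n        # estimates P(g(X) >= c)
rhs = sum(y ** r for y in samples) / n / c ** r    # estimates E(g(X)^r) / c^r

print(lhs <= rhs)  # the bound should hold, up to Monte Carlo noise
```

For these choices the bound is very loose: $P(X^2\ge 2)=e^{-\sqrt 2}\approx 0.24$, while $E(X^3)/2^{1.5}\approx 2.12$, so the simulation comfortably confirms the inequality.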

Last edited: Jan 20, 2013