# Homework Help: Markov's Inequality for Geometric Distribution.

1. Nov 22, 2015

### whitejac

1. The problem statement, all variables and given/known data
Let X∼Geometric(p). Using Markov's inequality find an upper bound for P(X≥a), for a positive integer a. Compare the upper bound with the real value of P(X≥a).

Then, using Chebyshev's inequality, find an upper bound for P(|X - EX| ≥ b).

2. Relevant equations
P(X ≥ a) ≤ E[X] / a
P(|X - E[X]| ≥ b) ≤ Var(X) / b^2

3. The attempt at a solution

Part 1:

So the first inequality is easy enough...
Markov's inequality:
E[X] = 1/p for X ∼ Geometric(p), so Markov gives P(X ≥ a) ≤ E[X]/a = 1/(ap).

Real value:
P(X≥a) = 1 - P(X≤a) = P(X = 0) + P(X = 1) + ... + P(X = a-1) + P(X = a) = p + pq + ... + pq^(a-1)
What I'm having trouble with is understanding the meaning of Markov's inequality and how to compare it mathematically with the real value of P(X ≥ a).

A lot of people simply say that the real value is less than Markov's bound and call that a comparison. That doesn't make much sense to me in the general form, because all I'd be saying is:

1 - P(X ≤ a) < 1/(ap)
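Not part of the original post, but here is a minimal Python sketch of the comparison (assumption: the geometric distribution supported on {1, 2, ...}, so E[X] = 1/p and P(X ≥ a) = (1 - p)^(a-1), the convention used in the reply below):

```python
# Sketch (my addition, not from the thread): compare Markov's bound
# with the exact tail, assuming X ~ Geometric(p) on {1, 2, ...}
# with E[X] = 1/p and P(X >= a) = (1 - p)**(a - 1).
p = 0.5
for a in range(1, 11):
    exact = (1 - p) ** (a - 1)       # exact tail probability P(X >= a)
    markov = (1 / p) / a             # Markov bound E[X] / a
    assert exact <= markov + 1e-12   # the bound holds for every a
    print(a, exact, markov)
```

For p = 1/2 the exact tail halves at every step, while the Markov bound only shrinks like 1/a, so the gap widens quickly.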

Part 2:
By definition, the upper bound is Var(X) / b^2 = (1-p) / (b^2 p^2).
I have a similar issue with Chebyshev's inequality... I can "do" it, but I don't really know what I'm doing. My book simply states that the deviation of X from E[X] is controlled by the variance, which is fine and intuitive, but I don't really think it should be this simple...
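Again my own addition, not the poster's: a sketch checking the Chebyshev bound numerically (same assumption as before, Geometric(p) on {1, 2, ...} with pmf p(1-p)^(k-1), E[X] = 1/p and Var(X) = (1-p)/p^2):

```python
# Sketch (assumption: Geometric(p) on {1, 2, ...}, pmf p*(1-p)**(k-1)).
# Compare the exact two-sided tail P(|X - E[X]| >= b) with the
# Chebyshev bound Var(X) / b**2.
p = 0.5
mean = 1 / p                 # E[X] = 1/p
var = (1 - p) / p ** 2       # Var(X) = (1 - p) / p^2
b = 2.0
# Sum the pmf over all k with |k - mean| >= b; truncating the infinite
# tail at k = 200 is safe because the terms decay geometrically.
exact = sum(p * (1 - p) ** (k - 1)
            for k in range(1, 200)
            if abs(k - mean) >= b)
bound = var / b ** 2
assert exact <= bound + 1e-12
print(exact, bound)
```

For p = 1/2 and b = 2 the exact probability is 1/8 while the bound is 1/2, so Chebyshev is valid but loose here too.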

So, are my interpretations correct? And are they explained properly...

Last edited: Nov 22, 2015
2. Nov 22, 2015

### Ray Vickson

Well, first off, be more careful with your inequalities. For integer values of "a", $P(X \geq a) = P(X > a-1) = (1-p)^{a-1}$ for the geometric case. So, the Markov inequality in this case is saying that $(1-p)^{a-1} \leq \frac{1}{pa}$. This IS true, but do you really think it is "obvious"?

Not only do a lot of people say that the Markov inequality is a bound, they are speaking the 100% truth, whether it makes sense to you or not. The point is: sometimes an exact evaluation of $P(X \geq a)$ is very difficult, but calculation of $E(X)/a$ is relatively easy. In such a case the Markov result is better than nothing, and sometimes is good enough in particular applications. Ditto for Chebyshev's inequality.

3. Nov 22, 2015

### whitejac

I apologize, your inequality was correct. I meant to fix that in my edit. Now, I guess this is why the geometric seems a little obvious to me...
(1-p)^(a-1) is a fraction raised to a growing exponent, so it shrinks exponentially for large "a." The Markov bound 1/(ap), on the other hand, only decays like 1/a, so for large "a" it will always be much higher.

Take the case (a = 3, p = 1/2).
We see already a large difference between the two sides:
P(X ≥ 3) = (1-p)^2 = 1/4

E[X]/a = (1/p)/a = 2/3. That's only the third term. I've heard that because these bounds apply so generally they cannot be tight in any specific case, but knowing the results of this case I can easily see it being true elsewhere.
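A one-liner check of the numbers above (my addition, using the same {1, 2, ...} support convention as post #2):

```python
# Sketch reproducing the worked case a = 3, p = 1/2
# (Geometric(p) on {1, 2, ...}).
p, a = 0.5, 3
exact = (1 - p) ** (a - 1)   # P(X >= 3) = 1/4
markov = (1 / p) / a         # E[X] / a = 2/3
assert exact < markov        # the bound holds but is loose here
```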