# Chebychev's inequality for two random variables

1. Oct 22, 2013

### rayge

(I wasn't sure how to title this, it's just that the statement resembles Chebychev's but with two RV's.)

1. The problem statement, all variables and given/known data

Let $\sigma_1^2 = \sigma_2^2 = \sigma^2$ be the common variance of $X_1$ and $X_2$, and let $\rho$ be the correlation coefficient of $X_1$ and $X_2$. Show for $k>0$ that

$P[|(X_1-\mu_1) + (X_2-\mu_2)|\geq k\sigma]\leq \frac{2(1+\rho)}{k^2}$

2. Relevant equations
Chebychev's inequality:
$P(|X-\mu|\geq k\sigma) \leq 1/k^2$

3. The attempt at a solution

I'm really only looking for a place to start. I can try working backwards and expanding $\rho$ into its definition, which is $E[(X_1-\mu_1)(X_2-\mu_2)]/\sigma_1\sigma_2$, but I really don't know how to evaluate that. I was wondering about using Markov's inequality and substituting $u(X_1,X_2)$ for $u(X_1)$, but of course there's no equation linking $X_1$ and $X_2$. Feeling stumped. Any suggestions welcome!
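Not a proof, but a quick numerical sanity check can build confidence in the claimed bound. The sketch below assumes an illustrative bivariate-normal model with $\sigma = 1$ and $\mu_1 = \mu_2 = 0$ (the choices of $\rho$, $k$, and the normal distribution are mine, not part of the problem), and compares the empirical tail probability with $2(1+\rho)/k^2$:

```python
import random
import math

# Sanity check (not a proof): estimate P[|(X1 - mu1) + (X2 - mu2)| >= k*sigma]
# for correlated standard normals and compare with the bound 2(1 + rho)/k^2.
# rho, k, and the normal model are illustrative assumptions.
random.seed(0)
rho, k, n = 0.5, 2.0, 100_000  # sigma = 1, mu1 = mu2 = 0

hits = 0
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x1 = z1
    x2 = rho * z1 + math.sqrt(1 - rho**2) * z2  # corr(x1, x2) = rho
    if abs(x1 + x2) >= k:  # threshold k*sigma with sigma = 1
        hits += 1

empirical = hits / n
bound = 2 * (1 + rho) / k**2
print(f"empirical = {empirical:.4f}, bound = {bound:.4f}")
```

With these numbers the bound is $2(1.5)/4 = 0.75$, and the simulated frequency should come in well below it, which is consistent with (though of course does not prove) the inequality.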

Last edited: Oct 22, 2013
2. Oct 22, 2013

### Staff: Mentor

I would try to find the mean and variance (okay, the mean is obvious) of $X_1 + X_2$. I think this converts Chebychev's inequality into the one you have to show.
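Following that hint, the key computation is the variance of the sum (a standard expansion, sketched here using the definitions from the problem statement):

```latex
\begin{align*}
\operatorname{Var}(X_1 + X_2)
  &= \sigma_1^2 + \sigma_2^2 + 2\rho\,\sigma_1\sigma_2 \\
  &= \sigma^2 + \sigma^2 + 2\rho\,\sigma^2
   = 2(1+\rho)\,\sigma^2 .
\end{align*}
% Chebychev applied to S = X_1 + X_2, with mean mu_S = mu_1 + mu_2
% and standard deviation sigma_S = sigma * sqrt(2(1 + rho)):
\begin{align*}
P\!\left[\,|S - \mu_S| \geq t\,\sigma_S\,\right] \leq \frac{1}{t^2}.
\end{align*}
```

Choosing $t = k/\sqrt{2(1+\rho)}$ makes the threshold $t\,\sigma_S = k\sigma$ and the bound $1/t^2 = 2(1+\rho)/k^2$, which is the stated inequality.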

The greek letter is rho, and \rho gives $\rho$.

