[SOLVED] Help with variance sum + correlation coefficient formula

Simfish

This is a worked example

The objective is to prove

(1) \quad -1 \leq \rho(X,Y) \leq 1

Then the book uses this formula...

(2) \quad 0 \leq \text{Var}\!\left( \frac{X}{\sigma_x} + \frac{Y}{\sigma_y} \right)

(3) \quad = \frac{\text{Var}(X)}{\sigma_x^2} + \frac{\text{Var}(Y)}{\sigma_y^2} + \frac{2\,\text{Cov}(X,Y)}{\sigma_x \sigma_y}

The question is: how does (2) lead to (3)? Namely, how does \text{Var}\!\left(\frac{X}{\sigma_x}\right) become \frac{\text{Var}(X)}{\sigma_x^2}?

Also, how does one get the idea to use formula (2) to prove (1)? It doesn't seem like a natural step.
 
Sorry, I edit my posts a lot - and at some point PF stops re-rendering the TeX in a post once it has been edited enough times...

Namely, how does \text{Var}\!\left(\frac{X}{\sigma_x}\right) become \frac{\text{Var}(X)}{\sigma_x^2}?
 
What is Var(aX), where a is a constant?
 
aVar(X)

Holy crap.
I never knew my attention lapses were that bad.
 
Simfishy said:
aVar(X)

That's not right. The variance of a one-dimensional random variable X is defined as \text{Var}(X) = \text{E}[(X-\text{E}(X))^2]. What does this mean in terms of scaling X by some quantity a?
 
Oh, I see.

a^2 Var(X)
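
For the record, this follows in one line from the definition quoted above (nothing beyond that definition is used), and it is exactly the step that turns \text{Var}(X/\sigma_x) into \text{Var}(X)/\sigma_x^2, with a = 1/\sigma_x:

\text{Var}(aX) = \text{E}[(aX - \text{E}(aX))^2] = \text{E}[a^2 (X - \text{E}(X))^2] = a^2\,\text{E}[(X - \text{E}(X))^2] = a^2\,\text{Var}(X)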
 
Now onto the next question: Given two random variables X and Y, what is \text{Var}(X+Y)? Apply the definition of \text{Var}(X) to the new random variable X+Y.
 
\text{Var}(X+Y) = \text{Var}(X) + \text{Var}(Y) + 2\,\text{Cov}(X,Y)

but that's from memorization - I'll try to derive it now
 
Var(X+Y)

= E[(X+Y)^2] - (E[X+Y])^2

= E[X^2 + 2XY + Y^2] - (E[X+Y])^2
= E[X^2] + 2E[XY] + E[Y^2] - (E[X] + E[Y])^2
= E[X^2] + 2E[XY] + E[Y^2] - (E[X])^2 - (E[Y])^2 - 2E[X]E[Y]
= (E[X^2] - (E[X])^2) + (E[Y^2] - (E[Y])^2) + 2(E[XY] - E[X]E[Y])
= Var(X) + Var(Y) + 2Cov(X,Y)

This holds whether or not X and Y are dependent; if they are independent, then E[XY] = E[X]E[Y], so Cov(X,Y) = 0 and it reduces to Var(X) + Var(Y).
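
As a quick numerical sanity check of that identity (just an illustration, not part of the derivation - NumPy, the seed, and the 0.5 coefficient are arbitrary choices here):

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)  # y is deliberately correlated with x

# population (ddof=0) variance and covariance, to match the E[.] algebra above
cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)
lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2 * cov_xy
print(lhs, rhs)  # the two numbers agree up to floating-point error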

==
Okay, can someone please address my second question?
Also, how does one get the idea to use formula (2) to prove (1)? It doesn't seem like a natural step.
 
What is \text{Var}\!\left(X/\sigma_x\right)?
 
\frac{\text{Var}(X)}{\sigma_x^2} = 1, since Var(X) = \sigma_x, Var(Y) = \sigma_y. I have the entire proof in the book - but the first step seems unnatural. (How does one get the inspiration to use 0 \leq \text{Var}\!\left( \frac{X}{\sigma_x} + \frac{Y}{\sigma_y} \right) to prove that the correlation coefficient has absolute magnitude \leq 1?)
 
The variance of any random variable is tautologically non-negative. Look at the definition of variance.
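
Spelled out: \text{Var}(Z) = \text{E}[(Z - \text{E}(Z))^2] is the expectation of a squared, hence non-negative, quantity, so \text{Var}(Z) \geq 0 for every random variable Z. Applying this to Z = \frac{X}{\sigma_x} + \frac{Y}{\sigma_y} is exactly inequality (2).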
 
You mean Var(X) = \sigma_x^2, Var(Y) = \sigma_y^2.

but the first step seems unnatural
The idea behind the correlation coefficient is that it is the covariance of the standardized variables X/\sigma_x and Y/\sigma_y (each variable divided by its own standard deviation), so looking at the variance of a sum of those standardized variables is a natural thing to try.
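
To see how that finishes the proof (a sketch - the book's write-up may phrase it differently): standardizing makes \text{Var}(X/\sigma_x) = \text{Var}(Y/\sigma_y) = 1 and \text{Cov}(X/\sigma_x, Y/\sigma_y) = \rho(X,Y), so line (3) becomes

0 \leq \text{Var}\!\left( \frac{X}{\sigma_x} + \frac{Y}{\sigma_y} \right) = 1 + 1 + 2\rho(X,Y) = 2\bigl(1 + \rho(X,Y)\bigr),

which gives \rho(X,Y) \geq -1. Running the same argument on the difference \frac{X}{\sigma_x} - \frac{Y}{\sigma_y} gives 0 \leq 2\bigl(1 - \rho(X,Y)\bigr), i.e. \rho(X,Y) \leq 1.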
 
Okay I see. Thanks. :)
 
