Silviu said:
Thank you for this. However, this doesn't really test independence, I think. You can have zero covariance while ##U## and ##V## are still dependent. Isn't that right? (I am new to this, so I am not sure if the normal distribution has anything particular that ensures independence based on correlation.)
I'm a bit concerned that you're unaware that zero covariance is a necessary condition for independence, for any random variables. I was quite clear when I said
StoneTemplePython said:
If ##U \perp V## then they must have zero covariance.
I then showed that the covariance is non-zero. The result is simple, general, and true for any linear-combination example like the one you gave, for any type of random variables (except in the very special cases where (a) the random variables are deterministic or (b) they don't have a finite second moment).
Ray mentioned some additional structural subtleties for Gaussians... they are important but subtle -- and I like easy stuff.
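To make the direction of the implication concrete, here is a small numerical sketch (my own illustrative example, not from this thread): take ##X \sim N(0,1)## and ##Y = X^2##. Then ##\text{cov}(X, Y) = E[X^3] = 0##, yet ##Y## is a deterministic function of ##X##, so they are as dependent as it gets. Zero covariance does not imply independence.

```python
# Illustrative example: zero covariance does NOT imply independence.
# X ~ N(0,1), Y = X^2: cov(X, Y) = E[X^3] = 0, but Y is a function of X.
import random

random.seed(0)
n = 200_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [x * x for x in xs]

def cov(a, b):
    """Sample covariance of two equal-length lists."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / len(a)

# cov(X, X^2) is near 0, yet cov(X^2, Y) = Var(X^2) = 2 is clearly
# non-zero -- knowing X pins down Y exactly.
print(cov(xs, ys))
print(cov([x * x for x in xs], ys))
```

The converse direction, independence implying zero covariance, always holds (given finite second moments), which is what the quoted statement uses.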
I also like working with zero-mean random variables wherever possible. You should be aware that for two random variables ##X## and ##Y##, ##\text{cov}(X, Y)## is the same whether or not you shift (the mean of) ##X## and/or ##Y## -- hence you can always choose to work with zero-mean random variables to make the point. This result -- that shifting (and in particular centering) doesn't change covariance -- is something you should know.
Here's the math
- - - -
##\text{cov}(X,Y) = E\big[XY\big] - E\big[X\big]E\big[Y\big]##
now consider shifting ##X## by some fixed constant ##b##.$$\text{cov}(X + b,Y) = E\big[(X+b)Y\big] - E\big[X+b\big]E\big[Y\big]\\
\text{cov}(X + b,Y) = E\big[XY+ bY\big] - (E\big[X\big] + E\big[b\big])E\big[Y\big]\\
\text{cov}(X + b,Y) = E\big[XY\big] +E\big[bY\big] - (E\big[X\big] + b)E\big[Y\big]\\
\text{cov}(X + b,Y) = E\big[XY\big] + bE\big[Y\big] - (E\big[X\big]E\big[Y\big] + bE\big[Y\big])\\
\text{cov}(X + b,Y) = E\big[XY\big] - E\big[X\big]E\big[Y\big] \\
\text{cov}(X + b,Y) = \text{cov}(X, Y)$$
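The derivation above can also be sanity-checked numerically. This is a minimal sketch (the choice of distribution, the coefficient 0.5, and the shift ##b = 7.3## are arbitrary assumptions of mine): the sample covariance of the shifted data matches the unshifted one.

```python
# Numerical check that cov(X + b, Y) = cov(X, Y): shifting X by a
# constant leaves the sample covariance unchanged (up to float round-off).
import random

random.seed(1)
n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
# Y deliberately correlated with X so the covariance is non-trivial.
ys = [0.5 * x + random.gauss(0.0, 1.0) for x in xs]

def cov(a, b):
    """Sample covariance of two equal-length lists."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / len(a)

b = 7.3
print(cov(xs, ys))                      # unshifted
print(cov([x + b for x in xs], ys))     # shifted: same value
```

The same argument applied to the second slot shows shifting ##Y## changes nothing either, which is why one can always center both variables first.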