Variance-covariance matrix of random vector

kingwinner
Notation:
Var(Y) is the variance-covariance matrix of a random vector Y
B' is the transpose of the matrix B.

1) Let A be an m x n matrix of constants, and Y be an n x 1 random vector. Then Var(AY) = A Var(Y) A'

Proof:
Var(AY)
= E[(AY - A E(Y)) (AY - A E(Y))']
= E[A (Y - E(Y)) (Y - E(Y))' A']
= A E[(Y - E(Y)) (Y - E(Y))'] A'   (*)
= A Var(Y) A'

Now, I don't understand the step marked (*). What theorem allows A and A' to be factored out of the expectation?
I remember a theorem that says if B is an m x n matrix of constants, and X is an n x 1 random vector, then BX is an m x 1 random vector and E(BX) = B E(X), but that theorem doesn't even apply here, since it requires X to be a column vector, not a matrix of arbitrary dimensions.
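(Setting aside which theorem justifies that step, the identity itself is easy to sanity-check numerically. A minimal NumPy sketch, where the sizes, seed, and covariance Sigma are made up purely for illustration:)

import numpy as np

rng = np.random.default_rng(0)
m, n, N = 2, 3, 200_000
A = rng.normal(size=(m, n))              # constant m x n matrix

# Build a known covariance matrix Sigma = Var(Y) and draw N samples of Y.
L = rng.normal(size=(n, n))
Sigma = L @ L.T                          # positive semidefinite by construction
Y = rng.multivariate_normal(np.zeros(n), Sigma, size=N)   # N x n array of samples

AY = Y @ A.T                             # each row is (AY)' for one sample
lhs = np.cov(AY, rowvar=False)           # sample estimate of Var(AY)
rhs = A @ Sigma @ A.T                    # A Var(Y) A'
print(np.max(np.abs(lhs - rhs)))         # small, up to Monte Carlo error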


2) Theorem: Let Y be an n x 1 random vector, and B be an n x 1 vector of constants (nonrandom). Then Var(B+Y) = Var(Y).

I don't see why this is true. How can we prove it?
Is it also true that Var(Y+B) = Var(Y)?


Any help is greatly appreciated!
 
For question 1: the matrix A is constant, so it (and A') can be factored outside of the expectation. This is the same type of principle you use with scalar random variables (think E(5X) = 5E(X)).

For 2: B is a collection of constants, and adding constants doesn't change the variance of a random variable. In a little more detail:

\begin{align*}
E(Y + B) &= \mu_Y + B \\
Var(Y+B) &= E[((Y+B) - (\mu_Y + B))((Y+B) - (\mu_Y + B))'] \\
&= E[(Y-\mu_Y)(Y-\mu_Y)'] = Var(Y)
\end{align*}
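A quick simulation makes the same point concretely. A minimal NumPy sketch (the vector B and covariance Sigma below are made up just for illustration):

import numpy as np

rng = np.random.default_rng(1)
n, N = 3, 200_000
B = np.array([10.0, -5.0, 2.5])    # arbitrary constant vector
Sigma = np.diag([1.0, 4.0, 9.0])   # made-up Var(Y)
Y = rng.multivariate_normal(np.zeros(n), Sigma, size=N)

# Shifting every sample by B moves the mean but not the spread, so the
# two sample covariance matrices agree (up to floating-point rounding).
print(np.max(np.abs(np.cov(Y + B, rowvar=False) - np.cov(Y, rowvar=False))))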
 
1) But how can we prove it rigorously in the general case of random matrices?
i.e. how can we prove that
E(AZ) = A E(Z)
and E(W A') = E(W) A' ?
where Z and W are any random matrices, and A is any constant matrix such that the product is defined

2) Thanks for the proof! Now I can see more rigorously why that property is true in the multivariate context.
 
1) You could start with the 2x2 case then generalize; or use induction.
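For instance, here is a sketch of the 2 x 2 case, written out entrywise (linearity of expectation for scalars does all the work):

\begin{align*}
AZ &= \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}
\begin{pmatrix} z_{11} & z_{12} \\ z_{21} & z_{22} \end{pmatrix}
= \begin{pmatrix} a_{11}z_{11} + a_{12}z_{21} & a_{11}z_{12} + a_{12}z_{22} \\
a_{21}z_{11} + a_{22}z_{21} & a_{21}z_{12} + a_{22}z_{22} \end{pmatrix} \\
E(AZ) &= \begin{pmatrix} a_{11}E(z_{11}) + a_{12}E(z_{21}) & a_{11}E(z_{12}) + a_{12}E(z_{22}) \\
a_{21}E(z_{11}) + a_{22}E(z_{21}) & a_{21}E(z_{12}) + a_{22}E(z_{22}) \end{pmatrix} = A\,E(Z)
\end{align*}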
 
kingwinner said:
1) But how can we prove it rigorously in the general case of random matrices?
i.e. how can we prove that
E(AZ) = A E(Z)
and E(W A') = E(W) A' ?
where Z and W are any random matrices, and A is any constant matrix such that the product is defined

2) Thanks for the proof! Now I can see more rigorously why that property is true in the multivariate context.

Suppose your random matrix is

Z = \begin{pmatrix} z_{11} & z_{12} & \dots & z_{1k} \\
z_{21} & z_{22} & \dots & z_{2k} \\
\vdots & \vdots & \ddots & \vdots \\
z_{m1} & z_{m2} & \dots & z_{mk}
\end{pmatrix}

and that your constant matrix is A, with similar notation for its entries. By the definition of matrix multiplication, the (r,t) entry of the matrix AZ is the random variable

\sum_{l=1}^m a_{rl} z_{lt}

so the expected value of the (r,t) entry is

E\left(\sum_{l=1}^m a_{rl} z_{lt}\right) = \sum_{l=1}^m E\left(a_{rl} z_{lt}\right) = \sum_{l=1}^m a_{rl} E\left(z_{lt}\right)

The second equality is true since each a value is a constant number and each z is a random variable, so the ordinary (scalar) rules of expectation apply. What does the equation mean?

a) The left side is the expected value of the (r,t) entry in the matrix AZ

b) The right side is the (r,t) entry in the matrix product of A and the expected value of Z (call this E(Z))

This shows that corresponding elements of E(AZ) and A E(Z) are equal, so

<br /> E(AZ) = A E(Z)<br />


This type of approach works whether you have random variables or random vectors.
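The entrywise argument can also be checked by simulation. A minimal NumPy sketch, where the dimensions and the mean matrix M (standing in for E(Z)) are made up for illustration:

import numpy as np

rng = np.random.default_rng(2)
p, m, k, N = 2, 3, 4, 100_000
A = rng.normal(size=(p, m))          # constant p x m matrix
M = rng.normal(size=(m, k))          # made-up E(Z)
Z = M + rng.normal(size=(N, m, k))   # N independent draws of a random m x k matrix

lhs = (A @ Z).mean(axis=0)           # sample estimate of E(AZ), averaging over draws
rhs = A @ M                          # A E(Z)
print(np.max(np.abs(lhs - rhs)))     # small, shrinking as N grows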
 
1) Once again, thanks for the great proof!

And I suppose the proof of E(W A') = E(W) A', with the constant matrix on the right of a random matrix W, can be done similarly, right?
 
kingwinner said:
1) Once again, thanks for the great proof!

And I suppose the proof of E(W A') = E(W) A', with the constant matrix on the right of a random matrix W, can be done similarly, right?

Yes, as can the derivations for the case of random and constant vectors.
 
I am trying to modify your proof to prove that E(ZA) = E(Z) A (assuming ZA is defined), but it doesn't seem to work out...

The (r,t) entry of the matrix ZA is the random variable given by

\sum_{l=1}^m Z_{rl} a_{lt}

so the expected value of the (r,t) entry is

E\left(\sum_{l=1}^m Z_{rl} a_{lt}\right) = \sum_{l=1}^m E\left(Z_{rl} a_{lt}\right) = \sum_{l=1}^m a_{lt} E\left(Z_{rl}\right)


?
 
kingwinner said:
I am trying to modify your proof to prove that E(ZA) = E(Z) A (assuming ZA is defined), but it doesn't seem to work out...

The (r,t) entry of the matrix ZA is the random variable given by

\sum_{l=1}^m Z_{rl} a_{lt}

so the expected value of the (r,t) entry is

E\left(\sum_{l=1}^m Z_{rl} a_{lt}\right) = \sum_{l=1}^m E\left(Z_{rl} a_{lt}\right) = \sum_{l=1}^m a_{lt} E\left(Z_{rl}\right)

Remember you want the matrix A to appear on the right, so factor the constants to the right in the sum (it doesn't matter for constants and variables, but it will make reconstructing the matrix product easier). Also make sure you have the matrices' indexes organized correctly.
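Spelling that hint out (a sketch, assuming Z is m x k so that the inner index runs to k):

E\left(\sum_{l=1}^k Z_{rl} a_{lt}\right) = \sum_{l=1}^k E\left(Z_{rl}\right) a_{lt}

and the right side is exactly the (r,t) entry of the matrix product E(Z) A, so E(ZA) = E(Z) A.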
 
Thanks a lot, statdad! You are of great help!
 
