# Variance-covariance matrix of random vector

1. Jun 15, 2009

### kingwinner

Notation:
Var(Y) is the variance-covariance matrix of a random vector Y
B' is the transpose of the matrix B.

1) Let A be an m x n matrix of constants, and Y be an n x 1 random vector. Then Var(AY) = A Var(Y) A'

Proof:
Var(AY)
= E[(AY-A E(Y)) (AY-A E(Y))' ]
= E[A(Y-E(Y)) (Y-E(Y))' A' ]
= A E[(Y-E(Y)) (Y-E(Y))'] A'
= A Var(Y) A'

Now, I don't understand the third step (where A and A' are pulled outside the expectation). What theorem is that step using?
I remember a theorem that says if B is an m x n matrix of constants and X is an n x 1 random vector, then BX is an m x 1 random vector and E(BX) = B E(X), but this theorem doesn't even apply here, since it requires X to be a column vector, not a matrix of arbitrary dimensions.

2) Theorem: Let Y be an n x 1 random vector, and B be an n x 1 vector of constants (nonrandom). Then Var(B+Y) = Var(Y).

I don't see why this is true. How can we prove this?
Is it also true that Var(Y+B) = Var(Y) ?

Any help is greatly appreciated!

2. Jun 16, 2009

For question 1: the matrix $$A$$ is constant, so it (and $$A'$$) can be factored outside of the expectation. This is the same principle you use with random variables (think $$E(5X) = 5E(X)$$ ).
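A quick numerical sanity check of the identity (a sketch using NumPy; the particular matrix A, the dimensions, and the sample size are arbitrary choices for illustration). Sample covariances obey the same identity exactly, since centering and averaging are linear:

```python
import numpy as np

rng = np.random.default_rng(0)

# Y: an n x 1 random vector, represented here by N sampled columns (n=3, N=2000).
n, N = 3, 2000
Y = rng.normal(size=(n, N))

# A: an arbitrary m x n constant matrix (m=2 here).
A = np.array([[1.0, 2.0, -1.0],
              [0.5, 0.0, 3.0]])

# np.cov treats each row as one variable and each column as one observation.
lhs = np.cov(A @ Y)            # sample Var(AY)
rhs = A @ np.cov(Y) @ A.T      # A Var(Y) A'

# The identity holds exactly for sample covariances (up to floating point).
print(np.allclose(lhs, rhs))   # True
```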

For 2: Again, $$B$$ is a collection of constants, and adding constants doesn't change the variance of a random variable. In a little more detail:

\begin{align*} E(Y + B) & = \mu_Y + B \\ Var(Y+B) & = E[((Y+B) - (\mu_Y + B))((Y+B) - (\mu_Y+B))'] \\ & = E[(Y-\mu_Y)(Y-\mu_Y)'] = Var[Y] \end{align*}
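The same derivation can be checked numerically (a sketch with NumPy; the shift vector B and the sample size are arbitrary). Since centering subtracts the sample mean, the constant B cancels exactly, mirroring the algebra above:

```python
import numpy as np

rng = np.random.default_rng(1)

# Y: an n x 1 random vector via N samples; B: a constant n x 1 shift.
n, N = 3, 2000
Y = rng.normal(size=(n, N))
B = np.array([[10.0], [-2.0], [0.5]])  # broadcast across the N sample columns

# Adding a constant vector shifts the mean but leaves the deviations unchanged.
lhs = np.cov(Y + B)
rhs = np.cov(Y)
print(np.allclose(lhs, rhs))  # True
```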

3. Jun 16, 2009

### kingwinner

1) But how can we prove it rigorously in the general case of random matrices?
i.e. how can we prove that
E(AZ) = A E(Z)
and E(W A') = E(W) A' ?
where Z and W are any random matrices, and A is any constant matrix such that the product is defined

2) Thanks for the proof! Now I can see more rigorously why that property is true in the multivariate context.

Last edited: Jun 16, 2009
4. Jun 16, 2009

### EnumaElish

1) You could start with the 2x2 case then generalize; or use induction.

5. Jun 16, 2009

Suppose your random matrix is

$$Z = \begin{pmatrix} z_{11} & z_{12} & \dots & z_{1k} \\ z_{21} & z_{22} & \dots & z_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ z_{m1} & z_{m2} & \dots & z_{mk} \end{pmatrix}$$

and that your constant matrix is $$A$$ with similar notation for its entries.
The $$(r,t)$$ entry of the matrix $$AZ$$ is, by the definition of matrix multiplication, the random variable given by

$$\sum_{l=1}^m a_{rl} z_{lt}$$

so the expected value of the $$(r,t)$$ entry is

$$E\left(\sum_{l=1}^m a_{rl}z_{lt}\right) = \sum_{l=1}^m E\left(a_{rl}z_{lt}\right) = \sum_{l=1}^m a_{rl} E\left(z_{lt}\right)$$

The second equality is true since each $$a$$ value is a constant number and each $$z$$ is a random variable, so the ordinary rules of expectation apply. What does the equation mean?

a) The left side is the expected value of the $$(r,t)$$ entry in the matrix $$AZ$$

b) The right side is the $$(r,t)$$ entry in the matrix product of $$A$$ and the expected value of $$Z$$ (call this $$E(Z)$$)

This shows that corresponding elements of $$E(AZ)$$ and $$A E(Z)$$ are equal, so

$$E(AZ) = A E(Z)$$

This type of approach works whether you have random variables or random vectors.
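The entrywise argument above can be illustrated with a Monte Carlo sketch in NumPy (the dimensions, the distribution of Z, and the matrix A are arbitrary choices). Sample means stand in for expectations, and the identity is exact for sample means by the same linearity used in the proof:

```python
import numpy as np

rng = np.random.default_rng(2)

# Z: a random m x k matrix; draw N independent realizations.
m, k, N = 3, 4, 5000
Zs = rng.normal(loc=1.0, scale=2.0, size=(N, m, k))

# A: a constant 2 x m matrix.
A = np.array([[1.0, -1.0, 2.0],
              [0.0, 3.0, 0.5]])

# Entrywise average of A Z over realizations vs. A times the average of Z.
E_AZ = (A @ Zs).mean(axis=0)   # matmul broadcasts over the N realizations
A_EZ = A @ Zs.mean(axis=0)

print(np.allclose(E_AZ, A_EZ))  # True
```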

6. Jun 16, 2009

### kingwinner

1) Once again, thanks for the great proof!

And I suppose the proof of E(W A') = E(W) A', with the constant matrix on the right of a random matrix W, can be done similarly, right?

7. Jun 16, 2009

Yes, as can the derivations for the case of random and constant vectors.
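The right-multiplication case admits the same numerical check (again a sketch; W's dimensions and A are arbitrary). The $$(r,t)$$ entry of $$WA'$$ is $$\sum_l w_{rl} a_{tl}$$, and linearity of the sample mean gives the identity exactly:

```python
import numpy as np

rng = np.random.default_rng(3)

# W: a random 2 x m matrix, N independent realizations.
m, N = 3, 5000
Ws = rng.normal(size=(N, 2, m))

# A: a constant 2 x m matrix, so A' is m x 2 and W A' is 2 x 2.
A = np.array([[1.0, -1.0, 2.0],
              [0.0, 3.0, 0.5]])

lhs = (Ws @ A.T).mean(axis=0)   # E(W A') via entrywise sample means
rhs = Ws.mean(axis=0) @ A.T     # E(W) A'

print(np.allclose(lhs, rhs))    # True
```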

8. Jun 16, 2009

### kingwinner

I am trying to modify your proof to prove that E(ZA) = E(Z) A (assuming ZA is defined), but it doesn't seem to work out...

The $$(r,t)$$ entry of the matrix $$ZA$$ is the random variable given by

$$\sum_{l=1}^m Z_{rl} a_{lt}$$

so the expected value of the $$(r,t)$$ entry is

$$E\left(\sum_{l=1}^m Z_{rl}a_{lt}\right) = \sum_{l=1}^m E\left(Z_{rl}a_{lt}\right) = \sum_{l=1}^m a_{lt} E\left(Z_{rl}\right)$$

?????

9. Jun 16, 2009