Variance-covariance matrix of random vector
#1
Jun15-09, 11:31 PM

P: 1,270

Notation:
Var(Y) is the variance-covariance matrix of a random vector Y, and B' is the transpose of the matrix B.

1) Let A be an m x n matrix of constants, and Y be an n x 1 random vector. Then Var(AY) = A Var(Y) A'.

Proof:
Var(AY) = E[(AY - A E(Y)) (AY - A E(Y))']
= E[A(Y - E(Y)) (Y - E(Y))' A']
= A E[(Y - E(Y)) (Y - E(Y))'] A'
= A Var(Y) A'

Now, I don't understand the step in red (the third equality, where A and A' are pulled outside the expectation). What theorem is that step using? I remember a theorem that says if B is an m x n matrix of constants and X is an n x 1 random vector, then BX is an m x 1 random vector and E(BX) = B E(X), but this theorem doesn't even apply here since it requires X to be a column vector, not a matrix of arbitrary dimensions.

2) Theorem: Let Y be an n x 1 random vector, and B be an n x 1 vector of constants (nonrandom). Then Var(B+Y) = Var(Y).
I don't see why this is true. How can we prove this? Is it also true that Var(Y+B) = Var(Y)?

Any help is greatly appreciated!
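P.S. Claim 1) does check out numerically. Here is a throwaway numpy sketch (the particular A and Var(Y) below are invented just for the test):

[code]
# Monte Carlo sanity check of Var(AY) = A Var(Y) A'
# (A and Var(Y) are made-up values, chosen only for illustration)
import numpy as np

rng = np.random.default_rng(0)
n, N = 3, 200_000

A = np.array([[1.0, 2.0, 0.0],
              [0.5, -1.0, 3.0]])          # 2 x 3 constant matrix
cov_Y = np.array([[2.0, 0.3, 0.1],
                  [0.3, 1.0, 0.2],
                  [0.1, 0.2, 0.5]])       # Var(Y), positive definite

Y = rng.multivariate_normal(np.zeros(n), cov_Y, size=N)  # N draws of Y, one per row
AY = Y @ A.T                                             # row i is (A y_i)'

print(np.round(np.cov(AY, rowvar=False), 2))  # sample Var(AY)
print(np.round(A @ cov_Y @ A.T, 2))           # A Var(Y) A' -- should agree up to noise
[/code]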


#2
Jun16-09, 07:02 AM

HW Helper
P: 1,372

For question 1: the matrix [tex] A [/tex] is constant, so it (and [tex] A' [/tex]) can be factored outside of the expectation. This is the same type of principle you use with random variables (think [tex] E(5x) = 5E(x) [/tex]).
For 2: Again, [tex] B [/tex] is a collection of constants, and adding a constant doesn't change the variance of a random variable. In a little more detail:

[tex] \begin{align*} E(Y + B) & = \mu_Y + B \\ Var(Y+B) & = E[((Y+B) - (\mu_Y + B))((Y+B) - (\mu_Y+B))'] \\ & = E[(Y-\mu_Y)(Y-\mu_Y)'] = Var[Y] \end{align*} [/tex]

And since B + Y = Y + B, the same computation gives Var(B+Y) = Var(Y).
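If you want to see this numerically, here is a quick throwaway check (the shift [tex] B [/tex] and the distribution of [tex] Y [/tex] are invented for the example):

[code]
# Check that Var(Y + B) = Var(Y): a constant shift leaves the covariance alone
import numpy as np

rng = np.random.default_rng(1)
L = np.array([[1.0, 0.0],
              [0.4, 0.9]])                 # mixing matrix to correlate the components
Y = rng.standard_normal((100_000, 2)) @ L.T
B = np.array([10.0, -5.0])                 # arbitrary constant vector

print(np.round(np.cov(Y, rowvar=False), 2))
print(np.round(np.cov(Y + B, rowvar=False), 2))  # same matrix up to sampling noise
[/code]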


#3
Jun16-09, 03:41 PM

P: 1,270

1) But how can we prove it rigorously in the general case of random matrices?
That is, how can we prove that E(AZ) = A E(Z) and E(W A') = E(W) A', where Z and W are any random matrices and A is any constant matrix such that the product is defined?

2) Thanks for the proof! Now I can see more rigorously why that property is true in the multivariate context.


#4
Jun16-09, 03:59 PM

Sci Advisor
HW Helper
P: 2,481

1) You could start with the 2x2 case and then generalize, or use induction.
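For example, the 2x2 case is just entrywise linearity of expectation (here [tex] A [/tex] is constant and [tex] Z [/tex] is random):

[tex] E(AZ) = \begin{pmatrix} E(a_{11}z_{11}+a_{12}z_{21}) & E(a_{11}z_{12}+a_{12}z_{22}) \\ E(a_{21}z_{11}+a_{22}z_{21}) & E(a_{21}z_{12}+a_{22}z_{22}) \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} E(z_{11}) & E(z_{12}) \\ E(z_{21}) & E(z_{22}) \end{pmatrix} = A\,E(Z) [/tex]

The general case is the same computation with more entries.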



#5
Jun16-09, 04:16 PM

HW Helper
P: 1,372

Suppose your random matrix is

[tex] Z = \begin{pmatrix} z_{11} & z_{12} & \dots & z_{1k} \\ z_{21} & z_{22} & \dots & z_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ z_{m1} & z_{m2} & \dots & z_{mk} \end{pmatrix} [/tex]

and that your constant matrix is [tex] A [/tex], with similar notation for its entries. The [tex](r,t)[/tex] entry of the matrix [tex] AZ [/tex] is the random variable given by [tex] \sum_{l=1}^m a_{rl} z_{lt} [/tex], so the expected value of the [tex] (r,t) [/tex] entry is

[tex] E\left(\sum_{l=1}^m a_{rl}z_{lt}\right) = \sum_{l=1}^m E\left(a_{rl}z_{lt}\right) = \sum_{l=1}^m a_{rl} E\left(z_{lt}\right) [/tex]

The second equality is true since each [tex] a [/tex] value is a constant number and each [tex] z [/tex] is a random variable, so the ordinary rules of expectation apply.

What does the equation mean?
a) The left side is the expected value of the [tex] (r,t) [/tex] entry in the matrix [tex] AZ [/tex].
b) The right side is the [tex] (r,t) [/tex] entry in the matrix product of [tex] A [/tex] and the expected value of [tex] Z [/tex] (call this [tex] E(Z) [/tex]).

This shows that corresponding entries of [tex] E(AZ) [/tex] and [tex] A E(Z) [/tex] are equal, so [tex] E(AZ) = A E(Z) [/tex]. This type of approach works whether you have random vectors or random matrices.
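If you want a numerical sanity check of the same identity, here is a disposable numpy sketch (the dimensions and the mean of [tex] Z [/tex] below are arbitrary choices):

[code]
# Entrywise check that E(AZ) = A E(Z) for a random matrix Z
import numpy as np

rng = np.random.default_rng(2)
n, m, k, N = 2, 3, 4, 100_000
A = rng.standard_normal((n, m))              # constant n x m matrix
mean_Z = rng.standard_normal((m, k))         # E(Z), an arbitrary m x k matrix

Z = mean_Z + rng.standard_normal((N, m, k))  # N draws of Z centered at mean_Z
print(np.round((A @ Z).mean(axis=0), 2))     # sample estimate of E(AZ)
print(np.round(A @ mean_Z, 2))               # A E(Z) -- should agree up to noise
[/code]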


#6
Jun16-09, 04:34 PM

P: 1,270

1) Once again, thanks for the great proof!
And I suppose the proof of E(W A') = E(W) A', with the constant matrix on the right of a random matrix W, can be done similarly, right? 


#7
Jun16-09, 04:42 PM

HW Helper
P: 1,372




#8
Jun16-09, 04:59 PM

P: 1,270

I am trying to modify your proof to prove that E(ZA) = E(Z) A (assuming ZA is defined), but it doesn't seem to work out...
The [tex](r,t)[/tex] entry of the matrix [tex] ZA [/tex] is the random variable given by [tex] \sum_{l=1}^m Z_{rl} a_{lt} [/tex], so the expected value of the [tex] (r,t) [/tex] entry is

[tex] E\left(\sum_{l=1}^m Z_{rl}a_{lt}\right) = \sum_{l=1}^m E\left(Z_{rl}a_{lt}\right) = \sum_{l=1}^m a_{lt} E\left(Z_{rl}\right) [/tex]

?????


#9
Jun16-09, 05:09 PM

HW Helper
P: 1,372

Remember you want the matrix [tex] A [/tex] to appear on the right, so factor the constants to the right in the sum (it doesn't matter for constants and random variables, but it will make reconstructing the matrix product easier). Also make sure you have the matrices' indices organized correctly.
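Spelled out (taking [tex] Z [/tex] to be [tex] m \times k [/tex] and [tex] A [/tex] to be [tex] k \times p [/tex] so that [tex] ZA [/tex] is defined), the [tex] (r,t) [/tex] entry of [tex] ZA [/tex] is [tex] \sum_{l=1}^k z_{rl} a_{lt} [/tex], and

[tex] E\left(\sum_{l=1}^k z_{rl} a_{lt}\right) = \sum_{l=1}^k E\left(z_{rl}\right) a_{lt} [/tex]

which is exactly the [tex] (r,t) [/tex] entry of [tex] E(Z)\,A [/tex]. Corresponding entries agree, so [tex] E(ZA) = E(Z) A [/tex].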


#10
Jun16-09, 05:27 PM

P: 1,270

Thanks a lot, statdad! You've been a great help!


