Variance-covariance matrix of random vector

In summary: in the first exchange it is proven that the variance-covariance matrix of a constant matrix times a random vector equals the constant matrix, times the variance-covariance matrix of the random vector, times the transpose of the constant matrix; the key step is that a constant matrix can be factored outside the expectation, just as with ordinary random variables. In the second exchange it is proven that the variance of a random vector is unchanged when a constant vector is added to it, regardless of which side the constant vector is added on. Both proofs extend to the general case of random matrices and constant matrices.
  • #1
kingwinner
Notation:
Var(Y) is the variance-covariance matrix of a random vector Y
B' is the transpose of the matrix B.

1) Let A be an m x n matrix of constants, and Y be an n x 1 random vector. Then Var(AY) = A Var(Y) A'

Proof:
Var(AY)
= E[(AY-A E(Y)) (AY-A E(Y))' ]
= E[A(Y-E(Y)) (Y-E(Y))' A' ]
= A E[(Y-E(Y)) (Y-E(Y))'] A'
= A Var(Y) A'

Now, I don't understand the step in red (the third equality, where A and A' are pulled outside the expectation). What theorem is that step using?
I remember a theorem that says if B is an m x n matrix of constants, and X is an n x 1 random vector, then BX is an m x 1 random vector and E(BX) = B E(X), but this theorem doesn't seem to apply here, since it requires X to be a column vector, whereas (Y-E(Y))(Y-E(Y))' is a matrix.


2) Theorem: Let Y be an n x 1 random vector, and B be an n x 1 vector of constants (nonrandom). Then Var(B+Y) = Var(Y).

I don't see why this is true. How can we prove this?
Is it also true that Var(Y+B) = Var(Y) ?


Any help is greatly appreciated!
 
  • #2
For question 1: the matrix [tex] A [/tex] is constant, so it (and [tex] A' [/tex]) can be factored outside of the expectation. This is the same principle you use with random variables (think [tex] E(5X) = 5E(X) [/tex]).
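
A quick numerical sanity check of Var(AY) = A Var(Y) A' (a minimal Python/NumPy sketch; the particular A, covariance matrix, and sample size are arbitrary illustrative choices):

[code]
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary example: Y is a 3 x 1 random vector with a known covariance matrix.
true_cov = np.array([[2.0, 0.5, 0.3],
                     [0.5, 1.5, 0.2],
                     [0.3, 0.2, 1.0]])
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, -1.0]])   # 2 x 3 constant matrix

# Draw many samples of Y (each row is one draw of Y').
Y = rng.multivariate_normal(mean=np.zeros(3), cov=true_cov, size=100_000)
AY = Y @ A.T                       # each row is (AY)' for one draw

print(np.cov(AY, rowvar=False))    # empirical Var(AY)
print(A @ true_cov @ A.T)          # theoretical A Var(Y) A'
[/code]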

For 2: Again, [tex] B [/tex] is a collection of constants, and adding constants doesn't change the variance of a random variable. In a little more detail:

[tex]
\begin{align*}
E(Y + B) & = \mu_Y + B \\
Var(Y+B) & = E[((Y+B) - (\mu_Y + B))((Y+B) - (\mu_Y+B))'] \\
& = E[(Y-\mu_Y)(Y-\mu_Y)'] = Var[Y]
\end{align*}
[/tex]
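
A similar minimal check of Var(Y+B) = Var(Y) (again Python/NumPy, with an arbitrary constant vector B): the sample covariance is essentially identical with and without the shift.

[code]
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary example: draws of a 2 x 1 random vector Y and a constant shift B.
Y = rng.multivariate_normal(mean=[0.0, 0.0],
                            cov=[[1.0, 0.4],
                                 [0.4, 2.0]],
                            size=100_000)   # rows are draws of Y'
B = np.array([10.0, -3.0])                  # constant vector

print(np.cov(Y, rowvar=False))              # sample Var(Y)
print(np.cov(Y + B, rowvar=False))          # sample Var(Y + B): same up to simulation noise
[/code]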
 
  • #3
1) But how can we prove it rigorously in the general case of random matrices?
i.e. how can we prove that
E(AZ) = A E(Z)
and E(W A') = E(W) A' ?
where Z and W are any random matrices, and A is any constant matrix such that the product is defined

2) Thanks for the proof! Now I can see more rigorously why that property is true in the multivariate context.
 
  • #4
1) You could start with the 2x2 case then generalize; or use induction.
 
  • #5
kingwinner said:
1) But how can we prove it rigorously in the general case of random matrices?
i.e. how can we prove that
E(AZ) = A E(Z)
and E(W A') = E(W) A' ?
where Z and W are any random matrices, and A is any constant matrix such that the product is defined

2) Thanks for the proof! Now I can see more rigorously why that property is true in the multivariate context.

Suppose your random matrix is (using the definition of matrix multiplication)

[tex]
Z = \begin{pmatrix} z_{11} & z_{12} & \dots & z_{1k} \\
z_{21} & z_{22} & \dots & z_{2k} \\
\vdots & \vdots & \ddots & \vdots \\
z_{m1} & z_{m2} & \dots & z_{mk}
\end{pmatrix}
[/tex]

and that your constant matrix is [tex] A [/tex] with similar notation for its entries.
The [tex](r,t)[/tex] entry of the matrix [tex] AZ [/tex] is the random variable given by

[tex]
\sum_{l=1}^m a_{rl} z_{lt}
[/tex]

so the expected value of the [tex] (r,t) [/tex] entry is

[tex]
E\left(\sum_{l=1}^m a_{rl}z_{lt}\right) = \sum_{l=1}^m E\left(a_{rl}z_{lt}\right) = \sum_{l=1}^m a_{rl} E\left(z_{lt}\right)
[/tex]

The second equality holds because each [tex] a [/tex] value is a constant number and each [tex] z [/tex] is a random variable, so the ordinary rules of expectation apply. What does the equation mean?

a) The left side is the expected value of the [tex] (r,t) [/tex] entry in the matrix [tex] AZ [/tex]

b) The right side is the [tex] (r,t)[/tex] entry in the matrix product of [tex] A [/tex] and the expected value of [tex] Z [/tex] (call this [tex] E(Z) [/tex])

This shows that corresponding elements of [tex] E(AZ) [/tex] and [tex] A E(Z) [/tex] are equal, so

[tex]
E(AZ) = A E(Z)
[/tex]


This entrywise approach works whether you have random vectors or random matrices.
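
The same conclusion can be checked numerically; a minimal Python/NumPy sketch (the dimensions and the distribution of the entries of Z below are arbitrary illustrative choices):

[code]
import numpy as np

rng = np.random.default_rng(2)

A = rng.normal(size=(4, 3))          # 4 x 3 constant matrix
mean_Z = rng.normal(size=(3, 5))     # E(Z), chosen arbitrarily

# Simulate many draws of the 3 x 5 random matrix Z with independent normal entries.
draws = mean_Z + rng.normal(size=(50_000, 3, 5))

E_AZ = np.mean(A @ draws, axis=0)    # sample estimate of E(AZ), averaged entrywise
A_EZ = A @ mean_Z                    # A E(Z)

print(np.max(np.abs(E_AZ - A_EZ)))   # small, up to Monte Carlo error
[/code]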
 
  • #6
1) Once again, thanks for the great proof!

And I suppose the proof of E(W A') = E(W) A', with the constant matrix on the right of a random matrix W, can be done similarly, right?
 
  • #7
kingwinner said:
1) Once again, thanks for the great proof!

And I suppose the proof of E(W A') = E(W) A', with the constant matrix on the right of a random matrix W, can be done similarly, right?

Yes, as can the derivations for the case of random and constant vectors.
 
  • #8
I am trying to modify your proof to prove that E(ZA) = E(Z) A (assuming ZA is defined), but it doesn't seem to work out...

The [tex](r,t)[/tex] entry of the matrix [tex] ZA [/tex] is the random variable given by

[tex]
\sum_{l=1}^m Z_{rl} a_{lt}
[/tex]

so the expected value of the [tex] (r,t) [/tex] entry is

[tex]
E\left(\sum_{l=1}^m Z_{rl}a_{lt}\right) = \sum_{l=1}^m E\left(Z_{rl}a_{lt}\right) = \sum_{l=1}^m a_{lt} E\left(Z_{rl}\right)
[/tex]


?
 
  • #9
kingwinner said:
I am trying to modify your proof to prove that E(ZA) = E(Z) A (assuming ZA is defined), but it doesn't seem to work out...

The [tex](r,t)[/tex] entry of the matrix [tex] ZA [/tex] is the random variable given by

[tex]
\sum_{l=1}^m Z_{rl} a_{lt}
[/tex]

so the expected value of the [tex] (r,t) [/tex] entry is

[tex]
E\left(\sum_{l=1}^m Z_{rl}a_{lt}\right) = \sum_{l=1}^m E\left(Z_{rl}a_{lt}\right) = \sum_{l=1}^m a_{lt} E\left(Z_{rl}\right)
[/tex]

Remember you want the matrix [tex] A [/tex] to appear on the right, so factor the constants to the right in the sum (the order doesn't matter when multiplying a constant and a random variable, but it makes reconstructing the matrix product easier). Also make sure the summation index and its upper limit match the matrices' dimensions.
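
For reference (this working is not in the original reply), with [tex] Z [/tex] taken as m x k and [tex] A [/tex] as k x n, the corrected sum would read

[tex]
E\left(\sum_{l=1}^k Z_{rl} a_{lt}\right) = \sum_{l=1}^k E\left(Z_{rl}\right) a_{lt},
[/tex]

which is exactly the [tex] (r,t) [/tex] entry of the matrix product [tex] E(Z) A [/tex], so [tex] E(ZA) = E(Z) A [/tex].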
 
  • #10
Thanks a lot, statdad! You've been a great help!
 

What is a variance-covariance matrix?

A variance-covariance matrix is a matrix that shows the variances and covariances of a set of random variables. It is used to describe the relationships between multiple variables in a dataset.

How is a variance-covariance matrix calculated?

A variance-covariance matrix is calculated by taking the variances of each variable on the diagonal, and the covariances between each pair of variables on the off-diagonal entries. The formula for calculating the covariance between two variables is: cov(X,Y) = E[(X-E[X])(Y-E[Y])], where E is the expected value.
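
As a concrete illustration of this formula (a minimal Python/NumPy sketch with made-up data), the sample variance-covariance matrix of a dataset can be computed by centering each variable and averaging the products of deviations:

[code]
import numpy as np

# Made-up dataset: 5 observations (rows) of 3 variables (columns).
X = np.array([[1.0, 2.0, 0.5],
              [2.0, 1.5, 1.0],
              [3.0, 3.5, 0.0],
              [4.0, 2.5, 2.0],
              [5.0, 4.0, 1.5]])

centered = X - X.mean(axis=0)                          # subtract E[X] from each column
cov_manual = centered.T @ centered / (X.shape[0] - 1)  # (i,j) entry estimates cov(X_i, X_j)

print(cov_manual)
print(np.cov(X, rowvar=False))                         # matches NumPy's built-in estimate
[/code]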

What does the variance-covariance matrix tell us about a dataset?

The variance-covariance matrix provides important information about the relationships between variables in a dataset. It can tell us which variables are highly correlated, which variables have a larger variance or spread, and how much each variable contributes to the overall variance of the dataset.

What are some common uses of the variance-covariance matrix?

The variance-covariance matrix is commonly used in statistical analysis, machine learning, and finance. It can be used to calculate the risk and return of a portfolio of assets, to identify multicollinearity in regression models, and to perform principal component analysis.
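
For example, the variance (risk) of a portfolio with weight vector w and asset-return covariance matrix Σ is w'Σw; a minimal sketch with made-up numbers:

[code]
import numpy as np

# Made-up example: covariance matrix of three asset returns and portfolio weights.
Sigma = np.array([[0.040, 0.006, 0.010],
                  [0.006, 0.090, 0.012],
                  [0.010, 0.012, 0.160]])
w = np.array([0.5, 0.3, 0.2])          # portfolio weights summing to 1

portfolio_variance = w @ Sigma @ w     # w' Sigma w
print(portfolio_variance)              # portfolio variance
print(np.sqrt(portfolio_variance))     # portfolio standard deviation (volatility)
[/code]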

How can the variance-covariance matrix be interpreted?

The variance-covariance matrix can be interpreted as a measure of the strength and direction of the linear relationship between variables. A high covariance between two variables indicates a strong positive or negative relationship, while a low covariance indicates little or no relationship. The variances on the diagonal represent the amount of variability in each variable, with larger variances indicating a wider spread of data points.
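
Because covariances depend on the units of the variables, they are often rescaled to correlations by dividing each covariance by the product of the corresponding standard deviations; a short sketch with a made-up covariance matrix:

[code]
import numpy as np

# Made-up covariance matrix.
cov = np.array([[ 4.00, 1.20, -0.80],
                [ 1.20, 2.25,  0.30],
                [-0.80, 0.30,  1.00]])

sd = np.sqrt(np.diag(cov))            # standard deviations from the diagonal
corr = cov / np.outer(sd, sd)         # corr(i,j) = cov(i,j) / (sd_i * sd_j)
print(corr)
[/code]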
