Calculating Covariance with a Random Vector

SUMMARY

The discussion focuses on calculating the covariance of two linear combinations of a random vector \(X\) with mean \(\mu_X = 0\) and covariance matrix \(K_{XX} = I\). The linear combinations are defined by the vectors \(a = (1, 1, 0, 0)\) and \(b = (0, 1, 1, 0)\). Expanding the product and dropping the cross terms, which have zero expectation, gives \(Cov(a^T X, b^T X) = E[X_2^2] = 1\), confirming that the covariance between the two combinations is 1. Interpreting \(K_{XX}\) correctly as the covariance matrix of \(X\) is the key step in the calculation.

PREREQUISITES
  • Understanding of random vectors and their properties
  • Familiarity with covariance and correlation concepts
  • Knowledge of linear algebra, particularly matrix operations
  • Basic statistics, specifically expectations and variances
NEXT STEPS
  • Study the properties of covariance matrices in multivariate distributions
  • Learn about the implications of zero mean in random variables
  • Explore linear transformations of random variables and their effects on covariance
  • Investigate the use of covariance in statistical inference and machine learning
USEFUL FOR

Students and professionals in statistics, data science, and machine learning who are working with multivariate distributions and need to understand covariance calculations in the context of random vectors.

ElijahRockers
Gold Member

Homework Statement


Let ##X## be a random variable such that ##\mu_X = 0## and ##K_{XX} = I##.
Find ##Cov(a^T X, b^T X)## for ##a = (1, 1, 0, 0)## and ##b = (0, 1, 1, 0)##.

The Attempt at a Solution


I guess I am assuming that ##X## is a 4-element random vector. I can't know the values of the random variables, but I know their mean, and since ##\mu_X = 0##, I think ##K_{XX} = I## gives
##E[X_i X_j] = 0, \quad i \neq j##
##E[X_i X_j] = 1, \quad i = j##

So..

##a^T X = X_1 + X_2 = A##
##b^T X = X_2 + X_3 = B##
##Cov(A,B) = E[AB] - E[A]E[B]##

##E[A]## and ##E[B]## are both 0, so

##Cov(A,B) = E[AB] = E[X_1 X_2 + X_1 X_3 + X_2^2 + X_2 X_3]##

From ##K_{XX}##, the cross terms have zero expectation, so ##E[AB] = E[X_2^2] = 1 = Cov(A,B)##.

Not sure if this is correct or not.
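The hand calculation above can be sanity-checked numerically. The sketch below (variable names are mine, and it assumes NumPy and a Gaussian ##X## purely for the simulation; the closed-form part needs no distributional assumption) evaluates the closed-form answer ##a^T K_{XX} b## and compares it against a Monte Carlo estimate of the covariance:

```python
import numpy as np

# The vectors defining the two linear combinations, and K_XX = I.
a = np.array([1.0, 1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 1.0, 0.0])
K = np.eye(4)

# Closed form: Cov(a^T X, b^T X) = a^T K_XX b.
exact = a @ K @ b
print(exact)  # 1.0

# Monte Carlo check: draw X with mean 0 and covariance I
# (standard normal works) and estimate Cov(A, B) from samples.
rng = np.random.default_rng(0)
X = rng.standard_normal((200_000, 4))
A = X @ a  # samples of a^T X = X_1 + X_2
B = X @ b  # samples of b^T X = X_2 + X_3
est = np.cov(A, B)[0, 1]
print(est)  # close to 1.0
```

Both numbers agree with the value 1 obtained by expanding ##E[AB]## by hand.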
 
ElijahRockers said:

##Cov(A,B) = E[AB] = E[X_1 X_2 + X_1 X_3 + X_2 X_2 + X_2 X_3]##

From ##K_{XX}##, ##E[AB] = E[X_2 X_2] = 1 = Cov(A,B)##

Not sure if this is correct or not.

It is correct if your interpretation of ##K_{XX}## is correct (which I cannot speak to because the notation is unfamiliar to me).
 
Ray Vickson said:
It is correct if your interpretation of ##K_{XX}## is correct (which I cannot speak to because the notation is unfamiliar to me).
##K_{XX}## is the covariance matrix of ##X##, whose ##(i,j)## entry is ##(K_{XX})_{i,j} = E[X_i X_j] - E[X_i]E[X_j]##... I believe.
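For completeness, with that interpretation of ##K_{XX}## the calculation is an instance of a standard identity (stated here in the thread's notation; the derivation is one line given ##\mu_X = 0##):

##Cov(a^T X, b^T X) = E[(a^T X)(X^T b)] - (a^T \mu_X)(\mu_X^T b) = a^T K_{XX} b##

With ##K_{XX} = I## this reduces to ##a^T b = (1)(0) + (1)(1) + (0)(1) + (0)(0) = 1##, agreeing with the term-by-term expansion of ##E[AB]##.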
 