Covariance of partitioned linear combination


Homework Help Overview

The problem involves a random vector partitioned into two components, with a focus on calculating the covariance of linear combinations of these components. The subject area includes concepts from statistics and linear algebra, particularly related to covariance matrices and random variables.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the interpretation of the covariance result, questioning how a scalar and a vector can yield a row vector covariance. There are attempts to clarify the covariance structure between the linear combinations of the partitioned random vector.

Discussion Status

The discussion is active, with participants providing insights into the interpretation of covariance results and suggesting alternative approaches to the problem. There is recognition of the structure of the covariance matrix and its implications for the relationships between the random variables involved.

Contextual Notes

Some participants note potential confusion regarding the notation and definitions used in the problem, particularly in relation to covariance expressions. There is an acknowledgment of the need for clarity in the mathematical representation of the covariance between different types of random variables.

showzen

Homework Statement


Given random vector ##X'=[X_1,X_2,X_3,X_4]## with mean vector ##\mu '_X=[4,3,2,1]## and covariance matrix
$$\Sigma_X=\begin{bmatrix}
3&0&2&2\\
0&1&1&0\\
2&1&9&-2\\
2&0&-2&4
\end{bmatrix}.$$
Partition ##X## as
$$X=\begin{bmatrix}
X_1\\X_2\\\hline X_3\\X_4\end{bmatrix}
=\begin{bmatrix}
X^{(1)}\\\hline X^{(2)}\end{bmatrix}.$$
Let ##A=[1,2]## and ##B=\begin{bmatrix}1&-2\\2&-1\end{bmatrix}##. Find Cov##(AX^{(1)},BX^{(2)})##.

Homework Equations


Cov##(CX)=C\Sigma_X C'##
Cov##(X^{(1)},X^{(2)})=\Sigma_{12}##

The Attempt at a Solution


Cov##(AX^{(1)},BX^{(2)})=A\Sigma_{12}B'=[1,2]\begin{bmatrix}2&2\\1&0\end{bmatrix}\begin{bmatrix}1&2\\-2&-1\end{bmatrix}=[0,6]##

Although I have arrived at an answer, I do not know how to interpret it. We have the scalar ##AX^{(1)}## and the vector ##BX^{(2)}##, yet the covariance comes out as a row vector?
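The hand computation above can be checked numerically. A minimal NumPy sketch, reading ##\Sigma_{12}## off the top-right ##2\times 2## block of ##\Sigma_X##:

```python
import numpy as np

# Covariance matrix of X from the problem statement
Sigma_X = np.array([[3, 0,  2,  2],
                    [0, 1,  1,  0],
                    [2, 1,  9, -2],
                    [2, 0, -2,  4]])

A = np.array([[1, 2]])            # 1x2
B = np.array([[1, -2],
              [2, -1]])           # 2x2

Sigma_12 = Sigma_X[:2, 2:]        # Cov(X^(1), X^(2)): rows 1-2, columns 3-4

cov = A @ Sigma_12 @ B.T          # Cov(AX^(1), BX^(2))
print(cov)                        # [[0 6]]
```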
 
In general, nice use of LaTeX here. One nitpick: ##\text{Cov}(C \mathbf X)=C\Sigma_X C^T## is wrong as written. It should read ##\text{Cov}(C \mathbf X, C \mathbf X)=C\Sigma_X C^T## -- covariance with only one argument doesn't make much sense. The more general idea here is that ##\text{Cov}(C \mathbf X, D \mathbf X)=C\Sigma_X D^T##. A non-symmetric covariance matrix can irk me a bit, so I get your question about how the result can be a (row) vector.

The idea here is this: suppose you have a scalar random variable ##Y_1## and a vector

##
\mathbf Y = \begin{bmatrix}
Y_2\\
Y_3
\end{bmatrix}##

Then ##\text{cov}(Y_1, \mathbf Y)## is just the covariance of ##Y_1## with ##Y_2## and the covariance of ##Y_1## with ##Y_3##. Collect these results in a vector -- that's it.

- - - -

Another way to approach this problem, would be to use

##A := \left[\begin{matrix}1 & 2 & 0 & 0\\0 & 0 & 0 & 0\\0 & 0 & 0 & 0\\0 & 0 & 0 & 0\end{matrix}\right]##

and

## B := \left[\begin{matrix}0 & 0 & 0 & 0\\0 & 0 & 0 & 0\\0 & 0 & 1 & -2\\0 & 0 & 2 & -1\end{matrix}\right]##

now apply

##\text{Cov}(A \mathbf X, B \mathbf X) = A \text{Cov}( \mathbf X, \mathbf X) B^T = A \Sigma_X B^T=
\left[\begin{matrix}0 & 0 & 0 & 6\\0 & 0 & 0 & 0\\0 & 0 & 0 & 0\\0 & 0 & 0 & 0\end{matrix}\right]
##

and you can just read off the result. E.g. the covariance of the scalar ##\big(A \mathbf X\big)_1## with the scalar ##\big(B \mathbf X\big)_4## is given in the top-right corner of the resulting matrix. In the end you're interested in that entry and in the covariance of ##\big( A\mathbf X\big)_1## with ##\big(B \mathbf X\big)_3##, which sits in row 1, column 3, and is of course zero.
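This zero-padded route can also be checked numerically; a quick NumPy sketch of the computation just described:

```python
import numpy as np

Sigma_X = np.array([[3, 0,  2,  2],
                    [0, 1,  1,  0],
                    [2, 1,  9, -2],
                    [2, 0, -2,  4]])

# Zero-padded 4x4 versions of A and B acting on the full vector X
A = np.zeros((4, 4), dtype=int)
A[0, :2] = [1, 2]
B = np.zeros((4, 4), dtype=int)
B[2:, 2:] = [[1, -2],
             [2, -1]]

result = A @ Sigma_X @ B.T        # Cov(AX, BX)
print(result)                     # only the (1,4) entry is nonzero: 6
```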
 
I would interpret it this way: ##U = AX^{(1)}## is a linear combination of the elements of ##X^{(1)}##, which results in a scalar. ##V = BX^{(2)}## is a pair of linear combinations of the elements of ##X^{(2)}##, which results in a column vector, ##V = \begin{bmatrix}
V_1 \\
V_2
\end{bmatrix}##. Then $$Cov(AX^{(1)}, BX^{(2)}) = Cov(U, V) =
\begin{bmatrix}
Cov(U, V_1) &
Cov(U, V_2)
\end{bmatrix}.$$
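As a sanity check on this interpretation, one can estimate ##Cov(U, V_1)## and ##Cov(U, V_2)## by simulation. A sketch, assuming purely for illustration that ##X## is multivariate normal with the stated mean and covariance (the problem itself specifies no distribution; only the first two moments matter here):

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([4, 3, 2, 1])
Sigma_X = np.array([[3, 0,  2,  2],
                    [0, 1,  1,  0],
                    [2, 1,  9, -2],
                    [2, 0, -2,  4]])

# Draw samples of X (one sample per row)
X = rng.multivariate_normal(mu, Sigma_X, size=200_000)

U = X[:, :2] @ np.array([1, 2])          # scalar A X^(1) per sample
V = X[:, 2:] @ np.array([[1, -2],
                         [2, -1]]).T     # vector B X^(2) per sample

# Sample covariances Cov(U, V1) and Cov(U, V2); should be near [0, 6]
est = [np.cov(U, V[:, k])[0, 1] for k in range(2)]
print(np.round(est, 2))
```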
 
Thanks everyone for your help. I see now that the answer is a matrix with elements that are the covariance between ##AX^{(1)}## and the elements of ##BX^{(2)}##.
 
