Can the Covariance of Random Vectors be Bounded by their Norms?

In summary: writing [itex]X^TAY=(A^TX)^TY[/itex], the Cauchy-Schwarz inequality for covariances yields [itex]\left|\operatorname{\mathbb{C}ov}\left(X^TAY;Z^TBW\right)\right| \leq \|A\|\,\|B\|\sqrt{\mathbb{E}\left[\|X\|^2\|Y\|^2\right]\mathbb{E}\left[\|Z\|^2\|W\|^2\right]}[/itex] — a bound in terms of the matrix norms and moments of the vectors. A bound in terms of [itex]\operatorname{\mathbb{C}ov}\left(X^TY;Z^TW\right)[/itex] itself does not hold in general when d>1.
  • #1
Pere Callahan
Hi there,

I am trying to prove the following: for any random vectors X, Y, Z, W in [itex]\mathbb{R}^d[/itex] and deterministic [itex]d\times d[/itex] matrices A and B, the covariance
[tex]
\operatorname{\mathbb{C}ov}\left(X^TAY;Z^TBW\right)
[/tex]
can in some way be bounded by the covariance
[tex]
\operatorname{\mathbb{C}ov}\left(X^TY;Z^TW\right)
[/tex]
and the norms of the matrices A and B. This is trivial if d=1, because then the first covariance is just AB times the second one, but I could not manage to prove something analogous in higher dimensions.
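The d = 1 observation can be checked numerically. The sketch below (plain Python, with made-up sample data) verifies that for scalars, Cov(aXY; bZW) = ab·Cov(XY; ZW), which is just bilinearity of the covariance:

```python
import random

random.seed(0)

def cov(u, v):
    """Sample covariance of two equal-length lists of numbers."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((x - mu) * (y - mv) for x, y in zip(u, v)) / len(u)

n = 10_000
a, b = 2.5, -1.5  # the "matrices" A and B in the d = 1 case

# Correlated scalar samples: X and Z share a common component, and W
# reuses Y, so the covariance of the products is nonzero.
common = [random.gauss(0, 1) for _ in range(n)]
X = [c + random.gauss(0, 1) for c in common]
Z = [c + random.gauss(0, 1) for c in common]
Y = [1 + random.gauss(0, 1) for _ in range(n)]
W = Y

XY = [x * y for x, y in zip(X, Y)]
ZW = [z * w for z, w in zip(Z, W)]

lhs = cov([a * u for u in XY], [b * v for v in ZW])  # Cov(aXY; bZW)
rhs = a * b * cov(XY, ZW)                            # ab * Cov(XY; ZW)
print(lhs, rhs)  # agree up to floating-point rounding
```

The identity holds exactly for sample covariances as well, so the two printed numbers match to machine precision.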

Any hints or tips are deeply appreciated,

Pere
 
  • #2


Dear Pere,

Thank you for your question. This is an interesting problem that has applications in various fields such as statistics, signal processing, and machine learning. After some consideration, I believe I have found a way to prove the desired bound.

Firstly, note that [itex]X^TAY = (A^TX)^TY[/itex] and [itex]Z^TBW = (B^TZ)^TW[/itex]. So if we define the random vectors X' = A^TX and Z' = B^TZ, the covariance can be rewritten as
[tex]
\operatorname{\mathbb{C}ov}\left(X^TAY;Z^TBW\right) = \operatorname{\mathbb{C}ov}\left(X'^TY;Z'^TW\right).
[/tex]
This works because A and B are deterministic and can be absorbed into the vectors.

Next, the Cauchy-Schwarz inequality for covariances of scalar random variables gives
[tex]
\left|\operatorname{\mathbb{C}ov}\left(X'^TY;Z'^TW\right)\right| \leq \sqrt{\operatorname{\mathbb{Var}}\left(X'^TY\right)\operatorname{\mathbb{Var}}\left(Z'^TW\right)}.
[/tex]

Since [itex]\operatorname{\mathbb{Var}}(U)\leq\mathbb{E}[U^2][/itex] and [itex]|x^TAy|\leq\|A\|\,\|x\|\,\|y\|[/itex], where [itex]\|A\|[/itex] denotes the operator norm, we get
[tex]
\operatorname{\mathbb{Var}}\left(X'^TY\right) = \operatorname{\mathbb{Var}}\left(X^TAY\right) \leq \|A\|^2\,\mathbb{E}\left[\|X\|^2\|Y\|^2\right],
[/tex]
and similarly for the second factor. Combining the two steps,
[tex]
\left|\operatorname{\mathbb{C}ov}\left(X^TAY;Z^TBW\right)\right| \leq \|A\|\,\|B\|\sqrt{\mathbb{E}\left[\|X\|^2\|Y\|^2\right]\,\mathbb{E}\left[\|Z\|^2\|W\|^2\right]}.
[/tex]

This bounds the covariance by the norms of A and B together with fourth-order moments of the vectors. A bound of the exact form you ask for, i.e. by [itex]\operatorname{\mathbb{C}ov}\left(X^TY;Z^TW\right)[/itex] times norms of A and B, cannot hold in general for d>1: take X=Y=Z=W uniform on [itex]\{\pm1\}^2[/itex], so that [itex]X^TY=2[/itex] is constant and the covariance on the right vanishes, while for [itex]A=B=e_1e_2^T[/itex] the quantity [itex]X^TAY=X_1X_2[/itex] has variance 1, so the covariance on the left equals 1.
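As a sanity check, the chain of inequalities can be verified on simulated data. The sketch below (plain Python, d = 2, with arbitrarily chosen matrices and distributions) confirms that |Cov(X^TAY; Z^TBW)| is bounded by the Cauchy-Schwarz term, which in turn is bounded by the moment bound; it uses the Frobenius norm, which dominates the operator norm, so the final bound still holds:

```python
import random

random.seed(1)

def gauss_vec(d):
    return [random.gauss(0, 1) for _ in range(d)]

def quad(x, M, y):
    """Compute x^T M y for lists x, y and a matrix M given as a list of rows."""
    return sum(x[i] * M[i][j] * y[j] for i in range(len(x)) for j in range(len(y)))

def fro(M):
    """Frobenius norm of M, an upper bound on the operator norm."""
    return sum(e * e for row in M for e in row) ** 0.5

def mean(u):
    return sum(u) / len(u)

def cov(u, v):
    mu, mv = mean(u), mean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

d, n = 2, 20_000
A = [[1.0, 2.0], [0.0, -1.0]]  # arbitrary deterministic matrices
B = [[0.5, 0.0], [1.0, 1.5]]

U, V, m1, m2 = [], [], [], []
for _ in range(n):
    X, Y = gauss_vec(d), gauss_vec(d)
    Z, W = X, Y  # take Z = X, W = Y so the two quadratic forms are correlated
    U.append(quad(X, A, Y))
    V.append(quad(Z, B, W))
    m1.append(sum(x * x for x in X) * sum(y * y for y in Y))  # ||X||^2 ||Y||^2
    m2.append(sum(z * z for z in Z) * sum(w * w for w in W))  # ||Z||^2 ||W||^2

lhs = abs(cov(U, V))                                   # |Cov(X^T A Y; Z^T B W)|
mid = (cov(U, U) * cov(V, V)) ** 0.5                   # Cauchy-Schwarz bound
top = fro(A) * fro(B) * (mean(m1) * mean(m2)) ** 0.5   # moment bound
print(lhs, mid, top)  # each value is at most the next
```

Both inequalities hold exactly for the empirical quantities (sample Cauchy-Schwarz, and a pointwise bound on each sample), not just in the limit, so the check is deterministic.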
 

Related to Can the Covariance of Random Vectors be Bounded by their Norms?

1. What is the covariance inequality?

The covariance inequality states that the absolute value of the covariance between two random variables is at most the product of their standard deviations: |Cov(X, Y)| ≤ σX σY. It is a direct consequence of the Cauchy-Schwarz inequality.
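A quick numerical illustration of this inequality (plain Python, with made-up data where Y is partly driven by X):

```python
import random

random.seed(2)

n = 5_000
X = [random.gauss(0, 1) for _ in range(n)]
Y = [0.7 * x + random.gauss(0, 0.5) for x in X]  # Y partly driven by X

# Sample covariance and standard deviations.
mx, my = sum(X) / n, sum(Y) / n
cov_xy = sum((x - mx) * (y - my) for x, y in zip(X, Y)) / n
sd_x = (sum((x - mx) ** 2 for x in X) / n) ** 0.5
sd_y = (sum((y - my) ** 2 for y in Y) / n) ** 0.5

print(abs(cov_xy), sd_x * sd_y)  # the first never exceeds the second
```

The bound holds exactly for sample quantities too, since the sample covariance also satisfies Cauchy-Schwarz.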

2. How is the covariance inequality used?

The covariance inequality is often used in statistics and probability to analyze the relationship between two variables and make predictions about their behavior. It is also used in various fields of science, such as finance and engineering, to understand the correlation between different factors.

3. What does a high covariance mean?

A covariance close to its upper bound (the product of the standard deviations) indicates a strong positive linear relationship between the two variables: as one increases, the other tends to increase as well. However, it is important to note that correlation does not necessarily imply causation.

4. How is the covariance inequality different from correlation?

Covariance and correlation are closely related but not the same. Covariance measures the strength and direction of the linear relationship between two variables in their original units, while the correlation coefficient is the covariance divided by the product of the standard deviations. Correlation is therefore a standardized, dimensionless quantity that always lies between -1 and 1; the covariance inequality is exactly what guarantees this range.
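The standardization difference is easy to see numerically: rescaling the variables rescales the covariance but leaves the correlation unchanged. A small sketch in plain Python with arbitrary data:

```python
import random

random.seed(3)

def cov(u, v):
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

def corr(u, v):
    """Correlation = covariance rescaled by both standard deviations."""
    return cov(u, v) / (cov(u, u) ** 0.5 * cov(v, v) ** 0.5)

n = 5_000
X = [random.gauss(0, 1) for _ in range(n)]
Y = [x + random.gauss(0, 1) for x in X]

X2 = [2 * x for x in X]  # rescale X by 2
Y2 = [3 * y for y in Y]  # rescale Y by 3

print(cov(X2, Y2) / cov(X, Y))   # ~6: covariance scales with the units
print(corr(X2, Y2), corr(X, Y))  # equal: correlation is unit-free
```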

5. Can the covariance be negative?

Yes, the covariance can be negative; the covariance inequality only bounds its absolute value. A negative covariance indicates a negative relationship between two variables, meaning that as one variable increases, the other tends to decrease. The magnitude of a covariance is hard to interpret on its own, which is another reason the standardized correlation coefficient is often preferred.
