Covariance Matrix of a Vector Random Variable with Related Components

In summary: using the Total Variance Theorem together with the conditional expectation and variance of Y given Z, each entry of the covariance matrix K_WW for X = ZU + Y can be reduced to moments of Z.
  • #1
CAVision

Homework Statement


Given X=ZU+Y
where
(i) U,X,Y, and Z are random variables
(ii) U~N(0,1)
(iii) U is independent of Z and Y
(iv) f(z) = [itex]\frac{3}{4} z^2[/itex] if [itex]1 \leq z \leq 2[/itex], f(z) = 0 otherwise
(v) [itex]f_{Y|Z=z}(y) = z e^{-zy}[/itex] for [itex]y \geq 0[/itex] (i.e. Y depends conditionally on Z)
(vi) define the vector W = [itex][X\ Y\ Z]^T[/itex]

Q: What is the covariance matrix [itex]K_{WW}[/itex]?


Homework Equations


[itex]K_{WW} = E[(W-E[W])(W-E[W])^T][/itex]
KWW = [itex]\begin{pmatrix}K_{XX} & K_{XY} & K_{XZ} \\
K_{YX} & K_{YY} & K_{YZ} \\
K_{ZX} & K_{ZY} & K_{ZZ}
\end{pmatrix}
[/itex]

[itex]V[X] = E[(X-E[X])^2] = E[X^2] - E^2[X] [/itex]
[itex]V[X] = E_Z[V[X|Z=z]] + V_Z[E[X|Z=z]] [/itex] (Total Variance Theorem)
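
As a quick numerical check of the matrix definition above, here is a minimal NumPy sketch that builds the covariance matrix from the outer-product definition and compares it to np.cov. The data are placeholder draws, not this problem's distribution:

[code]
import numpy as np

# Sanity-check sketch of K_WW = E[(W-E[W])(W-E[W])^T].
# Placeholder data: rows are sample draws of W = [X Y Z]^T,
# NOT the distribution from this problem.
rng = np.random.default_rng(0)
W = rng.standard_normal((10_000, 3))

mu = W.mean(axis=0)              # sample estimate of E[W]
D = W - mu                       # centered draws, W - E[W]
K = (D.T @ D) / (len(W) - 1)     # average outer product (W-E[W])(W-E[W])^T

# np.cov implements the same definition (columns as variables)
assert np.allclose(K, np.cov(W, rowvar=False))
[/code]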

The Attempt at a Solution


I can calculate [itex]K_{ZZ} = E[Z^2] - E^2[Z][/itex] using the definition of expectation.
However, I'm having difficulty calculating the other terms.
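
For reference, the [itex]K_{ZZ}[/itex] step as a short SymPy sketch of the moment integrals (it also checks that the stated density integrates to 1, which is worth verifying):

[code]
import sympy as sp

# Moments of Z under the stated density f(z) = (3/4) z^2 on [1, 2].
z = sp.symbols('z', positive=True)
f = sp.Rational(3, 4) * z**2

total = sp.integrate(f, (z, 1, 2))       # should equal 1 for a valid density
EZ = sp.integrate(z * f, (z, 1, 2))      # E[Z]
EZ2 = sp.integrate(z**2 * f, (z, 1, 2))  # E[Z^2]
KZZ = EZ2 - EZ**2                        # K_ZZ = E[Z^2] - E^2[Z]
print(total, EZ, EZ2, KZZ)
[/code]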

For example, if I use the Total Variance Theorem for [itex]K_{XX}[/itex] I get the following:
(using Total Variance Theorem)
[itex]V[X] = E_Z[V[X|Z=z]] + V_Z[E[X|Z=z]] [/itex]
(using X=ZU+Y)
[itex]V[X] = E_Z[V[ZU+Y|Z=z]] + V_Z[E[ZU+Y|Z=z]] [/itex]
(using Z=z)
[itex]V[X] = E_Z[V[zU+Y|Z=z]] + V_Z[E[zU+Y|Z=z]] [/itex]
(using [itex]V[aX+bY|Z=z] = a^2V[X|Z=z] + b^2V[Y|Z=z] + 2ab\,Cov[X,Y|Z=z][/itex])
[itex]V[X] = E_Z[z^2V[U|Z=z] + V[Y|Z=z] + 2zCov[U,Y|Z=z]] +V_Z[E[zU+Y|Z=z]] [/itex]
(using E[aX+bY|Z=z]=aE[X|Z=z] +bE[Y|Z=z])
[itex]V[X] = E_Z[z^2V[U|Z=z] + V[Y|Z=z] + 2zCov[U,Y|Z=z]] +V_Z[zE[U|Z=z] + E[Y|Z=z]] [/itex]
(using independence of U and Z to drop the conditioning on the U terms)
[itex]V[X] = E_Z[z^2V[U] + V[Y|Z=z] + 2zCov[U,Y|Z=z]] +V_Z[zE[U] + E[Y|Z=z]] [/itex]
(using [itex]E[U] = 0[/itex] and [itex]V[U] = 1[/itex], which is given by U~N(0,1))
[itex]V[X] = E_Z[z^2 + V[Y|Z=z] + 2zCov[U,Y|Z=z]] +V_Z[E[Y|Z=z]] [/itex]

Now, I can find [itex]V[Y|Z=z][/itex] and [itex]E[Y|Z=z][/itex], but I'm not sure how to find [itex]Cov[U,Y|Z=z][/itex].

From my understanding, U independent of Y means Cov[U,Y] = 0, but not necessarily that Cov[U,Y|Z=z] = 0. Let me know if this last statement is incorrect.

Am I approaching this correctly?
Is there an easier way?
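
In case it helps, here is a Monte Carlo sketch one could use to sanity-check any closed-form K_WW. One assumption on my part: the stated f(z) integrates to 7/4 rather than 1, so it is normalized to (3/7)z^2 before sampling:

[code]
import numpy as np

# Monte Carlo sanity check for K_WW. Assumption: f(z) is normalized
# to (3/7) z^2 on [1, 2] before sampling (as written it integrates
# to 7/4, not 1). Inverse CDF of the normalized density: F(z) = (z^3 - 1)/7.
rng = np.random.default_rng(0)
n = 1_000_000

Z = (1 + 7 * rng.random(n)) ** (1 / 3)  # Z via inverse-CDF sampling
Y = rng.exponential(scale=1 / Z)        # Y | Z=z ~ Exponential(rate z)
U = rng.standard_normal(n)              # U ~ N(0,1), independent of (Y, Z)
X = Z * U + Y                           # X = ZU + Y

print(np.cov(np.vstack([X, Y, Z])))     # estimate of K_WW (rows = variables)
[/code]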
 
  • #2


Your approach is correct so far. To find Cov[U,Y|Z=z], you can use the definition of conditional covariance:

Cov[U,Y|Z=z] = E[(U-E[U|Z=z])(Y-E[Y|Z=z])|Z=z]

Since U is independent of the pair (Y,Z), conditioning on Z=z changes neither the distribution of U nor its independence from Y. The conditional expectation of the product therefore factors:

Cov[U,Y|Z=z] = E[U-E[U]|Z=z] E[Y-E[Y|Z=z]|Z=z] = 0 * 0 = 0

The second factor is E[Y|Z=z] - E[Y|Z=z] = 0, and the first is E[U] = 0 anyway. Note that this argument needs U independent of (Y,Z) jointly; independence of U from Y alone gives Cov[U,Y] = 0 but not necessarily Cov[U,Y|Z=z] = 0, so your caution was justified.

Therefore, the Total Variance expression for V[X] simplifies to:

V[X] = E_Z[z^2 + V[Y|Z=z]] + V_Z[E[Y|Z=z]]

Next, find the conditional moments of Y. Given Z=z, Y is exponential with rate z, so integrating over y from 0 to ∞:

E[Y|Z=z] = ∫ y z e^(-zy) dy = 1/z

E[Y^2|Z=z] = ∫ y^2 z e^(-zy) dy = z (2/z^3) = 2/z^2

V[Y|Z=z] = E[Y^2|Z=z] - (E[Y|Z=z])^2 = 2/z^2 - 1/z^2 = 1/z^2

Plugging these in:

V[X] = E_Z[z^2 + 1/z^2] + V_Z[1/z] = E[Z^2] + 2 E[1/Z^2] - (E[1/Z])^2

where each moment of Z is an integral against the given density, e.g. E[Z^2] = ∫ z^2 (3/4) z^2 dz over 1 ≤ z ≤ 2. The off-diagonal entries K_XY, K_XZ, K_YZ follow the same way from the law of total covariance, Cov[A,B] = E[Cov[A,B|Z]] + Cov[E[A|Z], E[B|Z]], using E[X|Z=z] = E[Y|Z=z] = 1/z.
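
A SymPy sketch of that final expression, as a check (again taking the stated density as given; verify that it normalizes before trusting the numbers):

[code]
import sympy as sp

# V[X] = E[Z^2] + 2 E[1/Z^2] - (E[1/Z])^2, with moments taken against
# the stated density f(z) = (3/4) z^2 on [1, 2].
z = sp.symbols('z', positive=True)
f = sp.Rational(3, 4) * z**2

def moment(g):
    # E[g(Z)] as an integral against f over [1, 2]
    return sp.integrate(g * f, (z, 1, 2))

VX = moment(z**2) + 2 * moment(1 / z**2) - moment(1 / z) ** 2
print(VX)
[/code]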
 

What is a covariance matrix?

A covariance matrix is a square matrix that contains the variances and covariances of a set of random variables. It is used to describe the relationship between multiple variables and is an important tool in statistics and data analysis.

How is a covariance matrix calculated?

The covariance matrix is calculated by taking the covariance of each pair of variables in a dataset and placing them in a matrix. The diagonal elements of the matrix represent the variances of each variable, while the off-diagonal elements represent the covariances between variables.

What does the covariance matrix tell us?

The covariance matrix provides valuable information about the relationships between variables in a dataset: the sign of each off-diagonal entry tells us whether a pair of variables moves together or in opposition, its magnitude tells us how strongly they co-vary, and the diagonal gives each variable's own variance.

What is the difference between covariance and correlation?

Covariance and correlation both measure how two variables move together, but covariance is scale-dependent: its value changes with the units of the variables. Correlation is the covariance divided by the product of the two standard deviations, giving a dimensionless number between -1 and 1 that captures only the strength and direction of the linear relationship.

How is the covariance matrix used in data analysis?

The covariance matrix is used in data analysis to understand the relationships between variables and to make predictions about future data. It is also used in statistical methods such as principal component analysis and linear regression to reduce the dimensionality of datasets and identify patterns in the data.
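
For instance, PCA can be run directly off the covariance matrix; a small NumPy sketch with made-up data:

[code]
import numpy as np

# PCA from the covariance matrix: eigenvectors are the principal
# directions, eigenvalues the variance captured by each. Made-up data.
rng = np.random.default_rng(0)
A = np.array([[2.0, 0.0, 0.0],
              [0.5, 1.0, 0.0],
              [0.0, 0.3, 0.2]])
data = rng.standard_normal((500, 3)) @ A    # correlated columns

K = np.cov(data, rowvar=False)              # covariance matrix
eigvals, eigvecs = np.linalg.eigh(K)        # symmetric eigendecomposition
order = np.argsort(eigvals)[::-1]           # largest variance first
components = eigvecs[:, order]              # principal directions
explained = eigvals[order] / eigvals.sum()  # variance fraction per component
print(explained)
[/code]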
