Find Cov(Z,W) and E(X^2) if X is N(0,1)

  • Context: MHB 
  • Thread starter: WMDhamnekar
SUMMARY

The discussion focuses on calculating the covariance Cov(Z,W) where Z = 1 + X + XY² and W = 1 + X, with X and Y being independent standard normal variables, X ~ N(0,1). The final result is Cov(Z,W) = 2, derived from the variance and expected values of X and Y. The participants clarify that E(X²) = 1 due to the properties of the standard normal distribution, and they also discuss the variance of X², leading to insights about the chi-squared distribution.

PREREQUISITES
  • Understanding of covariance and variance in probability theory
  • Familiarity with properties of the normal distribution, specifically N(0,1)
  • Knowledge of expected values and their calculations
  • Basic concepts of chi-squared distribution
NEXT STEPS
  • Study the derivation of covariance for functions of random variables
  • Learn about the properties of the chi-squared distribution and its applications
  • Explore the relationship between variance and expected values in more complex scenarios
  • Investigate transformations of random variables, particularly squaring normal variables
USEFUL FOR

Statisticians, data scientists, and students of probability theory who are looking to deepen their understanding of covariance, variance, and the properties of normal and chi-squared distributions.

WMDhamnekar
Let $X$ and $Y$ be two independent $\mathcal{N}(0,1)$ random variables and
$$Z = 1 + X + XY^2$$
$$W = 1 + X$$
I want to find $\operatorname{Cov}(Z,W)$.

Solution:

$$\operatorname{Cov}(Z,W) = \operatorname{Cov}(1+X+XY^2,\ 1+X)$$
$$= \operatorname{Cov}(X+XY^2,\ X)$$
$$= \operatorname{Cov}(X,X) + \operatorname{Cov}(XY^2,X)$$
$$= \operatorname{Var}(X) + E(X^2Y^2) - E(XY^2)\,E(X)$$
$$= 1 + E(X^2)E(Y^2) - (E(X))^2E(Y^2)$$
$$= 1 + 1 - 0 = 2$$
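The chain of equalities above can be sanity-checked numerically. The following is a minimal Monte Carlo sketch, assuming numpy is available; the variable names are illustrative:

```python
import numpy as np

# Monte Carlo check of Cov(Z, W) for Z = 1 + X + X*Y^2, W = 1 + X,
# with X, Y independent standard normals.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)  # X ~ N(0, 1)
y = rng.standard_normal(n)  # Y ~ N(0, 1), independent of X

z = 1 + x + x * y**2
w = 1 + x

# Sample covariance; should land close to the analytic value 2.
cov_zw = np.cov(z, w)[0, 1]
print(cov_zw)
```

With a million samples the standard error of this estimate is only a few thousandths, so the printed value sits very near 2.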

Now $E(X)=0$, so $(E(X))^2E(Y^2)=0$. But I don't follow how $E(X^2)E(Y^2)=1$. Would any member explain that? My other question is: what is $\operatorname{Var}(X^2)$?
 
Re: E(X^2) and Var(X^2) if X is N(0,1)

Dhamnekar Winod said:
My other question is: what is $\operatorname{Var}(X^2)$?

Hi Dhamnekar,

Let's start with this one.
It's the variance. It is the mean of the squared deviations from the average.
And the average of $X$ is the same thing as the expected value $E(X)$ or just $EX$.
In formula form:
$$\operatorname{Var}(X) = E\left((X - EX)^2\right)$$
If we write it out, we can find that it can be rewritten as:
$$\operatorname{Var}(X) = E(X^2) - (EX)^2$$

Dhamnekar Winod said:
Let $X$ and $Y$ be two independent $\mathcal{N}(0,1)$ random variables and

(snip)

Now $E(X)=0$, so $(E(X))^2E(Y^2)=0$. But I don't follow how $E(X^2)E(Y^2)=1$. Would any member explain that?

Now let's get back to your first question.

The fact that $X \sim \mathcal{N}(0,1)$ means that $\operatorname{Var}(X)=1$.
Combine it with $EX=0$ and fill it in:
$$\operatorname{Var}(X) = E(X^2) - (EX)^2 \implies 1=E(X^2)-0 \implies E(X^2)=1$$
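The identity $E(X^2)=1$ is also easy to see numerically. A minimal sketch, assuming numpy is available:

```python
import numpy as np

# Numerical illustration that E(X^2) = 1 for X ~ N(0, 1), via
# Var(X) = E(X^2) - (E X)^2 with Var(X) = 1 and E(X) = 0.
rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)

m2 = np.mean(x**2)  # sample estimate of E(X^2)
print(m2)
```

The printed estimate sits within a few thousandths of 1.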
 
Re: E(X^2) and Var(X^2) if X is N(0,1)

Klaas van Aarsen said:

(snip)

Hello,
If $X$ is an $\mathcal{N}(0,1)$ random variable and $Y=X^2$ is a function of $X$, what is the distribution of $Y$? Is it normal?
 
Re: E(X^2) and Var(X^2) if X is N(0,1)

Dhamnekar Winod said:
Hello,
If $X$ is an $\mathcal{N}(0,1)$ random variable and $Y=X^2$ is a function of $X$, what is the distribution of $Y$?

Is it normal?

No...

In probability theory and statistics, the chi-squared distribution (also chi-square or $\chi^2$-distribution) with $k$ degrees of freedom is the distribution of a sum of the squares of $k$ independent standard normal random variables. Here $X^2 \sim \chi^2_1$, the case $k=1$; a $\chi^2_k$ variable has mean $k$ and variance $2k$, so $E(X^2)=1$ and $\operatorname{Var}(X^2)=2$.
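A quick simulation confirms that $X^2$ has mean $1$ and variance $2$, as expected for a chi-squared variable with one degree of freedom. A minimal sketch, assuming numpy is available:

```python
import numpy as np

# For X ~ N(0, 1), Y = X^2 follows a chi-squared distribution with
# k = 1 degree of freedom: mean k = 1, variance 2k = 2.
rng = np.random.default_rng(2)
x = rng.standard_normal(1_000_000)
y = x**2

mean_y = y.mean()  # should be close to 1
var_y = y.var()    # should be close to 2
print(mean_y, var_y)
```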
 
