Help with independent random variables and correlation

SUMMARY

This discussion focuses on the independence and correlation properties of random variables X and Y, where X is a standard normal variable and Y is defined as Y = ZX, with Z being an independent random variable taking values +1 and -1 with equal probability. It is established that Y and Z are independent, Y is normally distributed with mean 0 and variance 1, and while X and Y are uncorrelated, they are not independent. The joint density of X and Y is derived, demonstrating that they are not jointly normally distributed.

PREREQUISITES
  • Understanding of normal distribution and properties of random variables
  • Knowledge of independence and correlation in probability theory
  • Familiarity with joint density functions
  • Basic calculus for evaluating integrals in probability
NEXT STEPS
  • Study the properties of independent random variables in probability theory
  • Learn about joint distributions and their implications for correlation
  • Explore the concept of conditional expectation in relation to independence
  • Investigate the implications of non-independence in statistical modeling
USEFUL FOR

Statisticians, data scientists, and students of probability theory who are interested in the nuances of correlation and independence among random variables.

ychu066
1. Let X be a normal variable with mean 0 and variance 1. Let Y = ZX,
where Z and X are independent and Pr(Z = +1) = Pr(Z = -1) = 1/2.

a.) Show that Y and Z are independent.
b.) Show that Y is also normal with mean 0 and variance 1.
c.) Show that X and Y are uncorrelated but dependent.
d.) Can you write down the joint density of X and Y? Explain your answer.

Note that this example exhibits two random variables which are uncorrelated, normally distributed, but not independent (and necessarily not jointly normally distributed).

Please help me with the bold questions...
 
a.)
Since ##X## is continuous, pointwise probabilities such as ##Pr(X = y/z)## are all zero, so we work with the CDF instead. For any ##y\in\mathbb{R}## and ##z\in\{\,-1,1\,\}##,
\begin{align*}
Pr(Z=z \wedge Y\le y) &= Pr(Z=z \wedge zX\le y)= Pr(Z=z)\cdot Pr(zX\le y)\\
&=Pr(Z=z)\cdot Pr(X\le y) = Pr(Z=z)\cdot Pr(Y\le y)
\end{align*}
We used that ##X## and ##Z## are independent, and that ##zX## has the same distribution as ##X## for any fixed ##z\in\{\,-1,1\,\}##, because the standard normal density is symmetric about ##0##. Summing the first line over ##z=\pm 1## also gives ##Pr(Y\le y)=Pr(X\le y)##, which is exactly b.): ##Y## is normal with mean 0 and variance 1.
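A quick Monte Carlo sanity check of a.) and b.) (a sketch, not part of the proof; it assumes NumPy is available, and the variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X ~ N(0, 1) and an independent random sign Z with Pr(Z = +1) = Pr(Z = -1) = 1/2.
X = rng.standard_normal(n)
Z = rng.choice([-1.0, 1.0], size=n)
Y = Z * X

# b.) Y should again look standard normal: mean near 0, variance near 1.
print(Y.mean(), Y.var())

# a.) Independence of Y and Z: the distribution of Y should not change
# when we condition on the sign Z.
print(Y[Z == 1].var(), Y[Z == -1].var())   # both near 1
```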

c.)
By the independence of ##Z## and ##X##,
\begin{align*}
E(Y)=E(ZX)=E(Z)\,E(X)=0\cdot 0=0
\end{align*}
since ##E(Z)=\tfrac{1}{2}(+1)+\tfrac{1}{2}(-1)=0##. For the same reason,
\begin{align*}
E(XY)=E(ZX^2)=E(Z)\,E(X^2)=0\cdot 1=0,
\end{align*}
so ##\operatorname{Cov}(X,Y)=E(XY)-E(X)E(Y)=0## and ##X## and ##Y## are uncorrelated.
They are not independent, however, because ##|Y|=|X|## always. For example,
\begin{align*}
Pr(|X|\le 1 \wedge |Y|>1)=0 \neq Pr(|X|\le 1)\cdot Pr(|Y|>1)>0.
\end{align*}
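The uncorrelated-but-dependent conclusion of c.) can also be checked numerically (a sketch assuming NumPy; the event ##\{|X|\le 1,\ |Y|>1\}## is just one convenient witness of dependence):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
X = rng.standard_normal(n)
Y = rng.choice([-1.0, 1.0], size=n) * X   # Y = Z*X with Z an independent sign

# Uncorrelated: the sample correlation of X and Y is near 0.
print(np.corrcoef(X, Y)[0, 1])

# Dependent: |Y| = |X| always, so the event {|X| <= 1, |Y| > 1} never
# occurs, while the product of the marginal probabilities is positive.
p_joint = np.mean((np.abs(X) <= 1) & (np.abs(Y) > 1))
p_prod = np.mean(np.abs(X) <= 1) * np.mean(np.abs(Y) > 1)
print(p_joint, p_prod)   # 0.0 versus roughly 0.68 * 0.32
```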
 
