Distribution, expected value, variance, covariance and correlation

In summary, the conversation works through the distribution, expected value, and variance of the random variable $XY$, where $X$ is Bernoulli distributed and $Y$ and $Z$ are Poisson distributed, and then the covariance and correlation between $XY$ and $XZ$. The final results depend on the success parameter $p$ and on the Poisson parameters $\lambda$ and $\mu$.
  • #1
mathmari
Hey! :giggle:

Let $X$, $Y$ and $Z$ be independent random variables. Let $X$ be Bernoulli distributed on $\{0,1\}$ with success parameter $p_0$ and let $Y$ be Poisson distributed with parameter $\lambda$ and let $Z$ be Poisson distributed with parameter $\mu$.

(a) Calculate the distribution, the expected value and the variance of $XY$.

(b) Determine the Covariance and the correlation between $XY$ and $XZ$.
For question (a) :

We have that $$P(X=0)=1-p_0 \ \text{ and} \ P(X=1)=p_0$$ and $$P(Y=k)=\frac{\lambda^k}{k!}\cdot e^{-\lambda}$$

So we get that $$P(XY=k)=P(XY=k|X=0)P(X=0)+P(XY=k|X=1)P(X=1)$$ If $k=0$ then \begin{align*}P(XY=0)&=P(XY=0|X=0)P(X=0)+P(XY=0|X=1)P(X=1)\\ & =1-p_0+e^{-\lambda}\end{align*} If $k\neq 0$ then \begin{align*}P(XY=k)&=P(XY=k|X=0)P(X=0)+P(XY=k|X=1)P(X=1)\\ & =0+\frac{\lambda^k}{k!}\cdot e^{-\lambda}\cdot p_0\\ & = \frac{\lambda^k}{k!}\cdot e^{-\lambda}\cdot p_0\end{align*}

Is that correct? :unsure:
 
  • #2
mathmari said:
If $k=0$ then \begin{align*}P(XY=0)&=P(XY=0|X=0)P(X=0)+P(XY=0|X=1)P(X=1)\\ & =1-p_0+e^{-\lambda}\end{align*}
Hey mathmari!

Don't we have $P(XY=0|X=1)P(X=1) \,=\, P(Y=0)P(X=1) \,=\, e^{-\lambda}\cdot p_0$? :unsure:
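A minimal Monte Carlo sketch (assuming Python with numpy; the parameter values are arbitrary) to check the corrected probability $P(XY=0)=(1-p_0)+p_0e^{-\lambda}$:

```python
# Monte Carlo sanity check of P(XY = 0) = (1 - p) + p * exp(-lambda)
import numpy as np

rng = np.random.default_rng(seed=0)
p, lam, n = 0.3, 2.0, 1_000_000   # arbitrary illustrative values

X = rng.binomial(1, p, size=n)    # Bernoulli(p) samples
Y = rng.poisson(lam, size=n)      # Poisson(lambda) samples

print(np.mean(X * Y == 0))        # empirical estimate of P(XY = 0)
print(1 - p + p * np.exp(-lam))   # formula with the corrected p-factor
```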
 
  • #3
Klaas van Aarsen said:
Don't we have $P(XY=0|X=1)P(X=1) \,=\, P(Y=0)P(X=1) \,=\, e^{-\lambda}\cdot p_0$? :unsure:

Ahh yes! (Malthe)

Then the expected value is $E[XY]=E[X]\cdot E[Y]$ because they are independent, right?

About the variance we have $V(XY)=E((XY)^2)-(E[XY])^2$, but how do we calculate $E((XY)^2)$ ?

:unsure:
 
  • #4
mathmari said:
About the variance we have $V(XY)=E((XY)^2)-(E[XY])^2$, but how do we calculate $E((XY)^2)$ ?
We have $E\big((XY)^2\big) = E\big(X^2\cdot Y^2\big)$ and since $X$ and $Y$ are independent, $X^2$ and $Y^2$ will also be independent. 🤔
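The factorisation can also be checked numerically; a minimal sketch assuming numpy, with arbitrary parameter values:

```python
# For independent X and Y: E[(XY)^2] = E[X^2] * E[Y^2]
import numpy as np

rng = np.random.default_rng(seed=1)
p, lam, n = 0.3, 2.0, 1_000_000

X = rng.binomial(1, p, size=n)
Y = rng.poisson(lam, size=n)

print(np.mean((X * Y) ** 2))               # E[(XY)^2] estimated directly
print(np.mean(X ** 2) * np.mean(Y ** 2))   # product of the separate estimates
```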
 
  • #5
Klaas van Aarsen said:
We have $E\big((XY)^2\big) = E\big(X^2\cdot Y^2\big)$ and since $X$ and $Y$ are independent, $X^2$ and $Y^2$ will also be independent. 🤔

Ahh ok! So it is $E[X^2]\cdot E[Y^2]$. But how are these factors defined? I got stuck right now... Do we use the variance? :unsure:
 
  • #6
mathmari said:
Ahh ok! So it is $E[X^2]\cdot E[Y^2]$. But how are these factors defined? I got stuck right now.
Easiest is to look up the variances of the Bernoulli and Poisson distributions and use those. 🤔
 
  • #7
Klaas van Aarsen said:
Easiest is to look up the variances of the Bernoulli and Poisson distributions and use those. 🤔

We have the following (writing $p$ for the success parameter $p_0$):
\begin{equation*}V(XY)=E((XY)^2)-(E[XY])^2= E\big(X^2\cdot Y^2\big)-\left (p\cdot \lambda\right )^2= E\big(X^2\big)\cdot E\big( Y^2\big)-p^2\cdot \lambda^2\end{equation*}
The variance of $X$ is $\text{Var}(X)=p(1-p)$ and \begin{align*}E(X^2)-(E(X))^2=p(1-p) &\Rightarrow E(X^2)=p(1-p)+(E(X))^2 \Rightarrow E(X^2)=p(1-p)+p^2 \\ & \Rightarrow E(X^2)=p-p^2+p^2\Rightarrow E(X^2)=p\end{align*}
The variance $Y$ is $\text{Var}(Y)=\lambda$ and \begin{equation*}E(Y^2)-(E(Y))^2=\lambda \Rightarrow E(Y^2)=\lambda+(E(Y))^2 \Rightarrow E(Y^2)=\lambda+\lambda^2 \end{equation*}
So we get \begin{equation*}V(XY)= E\big(X^2\big)\cdot E\big( Y^2\big)-p^2\cdot \lambda^2= p\cdot \left (\lambda+\lambda^2\right )-p^2\cdot \lambda^2= p\cdot \lambda+p\cdot\lambda^2-p^2\cdot \lambda^2= p\cdot \lambda+(p-p^2)\cdot \lambda^2\end{equation*}

:unsure:
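The closed form $V(XY)=p\lambda+(p-p^2)\lambda^2$ can be checked by simulation; a minimal sketch assuming numpy, with arbitrary $p$ and $\lambda$:

```python
# Empirical variance of XY versus p*lam + (p - p^2)*lam^2
import numpy as np

rng = np.random.default_rng(seed=2)
p, lam, n = 0.3, 2.0, 1_000_000

X = rng.binomial(1, p, size=n)
Y = rng.poisson(lam, size=n)

print(np.var(X * Y))                   # sample variance of XY
print(p * lam + (p - p**2) * lam**2)   # derived formula
```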
 
  • #8
Looks correct to me. :unsure:
 
  • #9
Klaas van Aarsen said:
Looks correct to me. :unsure:

Great! For question (b) : The covariance is $\text{Cov}(XY, XZ)=E[(XY)(XZ)]-E[XY]E[XZ]$.

How do we calculate $E[(XY)(XZ)]$ ? :unsure:
 
  • #10
mathmari said:
How do we calculate $E[(XY)(XZ)]$ ?
Isn't it the same as $E[X^2\cdot Y \cdot Z]$? :unsure:
 
  • #11
Klaas van Aarsen said:
Isn't it the same as $E[X^2\cdot Y \cdot Z]$? :unsure:

Ah and this is equal to $E[X^2]\cdot E[Y] \cdot E[Z]$ because these are independent random variables, right? :unsure:
 
  • #12
mathmari said:
Ah and this is equal to $E[X^2]\cdot E[Y] \cdot E[Z]$ because these are independent random variables, right?
Yep. (Nod)
 
  • #13
Klaas van Aarsen said:
Yep. (Nod)

So we have the following:

The covariance of $XY$ and $XZ$ is \begin{align*}\text{Cov}(XY, XZ)&=E[(XY)(XZ)]-E[XY]E[XZ]=E[X^2\cdot Y \cdot Z]-p\cdot \lambda\cdot p\cdot \mu=E[X^2]\cdot E[Y] \cdot E[Z]-p\cdot \lambda\cdot p\cdot \mu\\ & =p\cdot \lambda \cdot \mu-p^2\cdot \lambda\cdot \mu=(p-p^2)\cdot \lambda \cdot \mu\end{align*}
The correlation of $XY$ and $XZ$ is \begin{align*}\rho _{XY,XZ}={\frac {\operatorname {Cov} (XY,XZ)}{{\sqrt {\operatorname {Var} (XY)}}{\sqrt {\operatorname {Var} (XZ)}}}}={\frac {(p-p^2)\cdot \lambda \cdot \mu}{{\sqrt {p\cdot \lambda+(p-p^2)\cdot \lambda^2}}{\sqrt {p\cdot \mu+(p-p^2)\cdot \mu^2}}}}\end{align*}

Is everything correct? :unsure:
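Both formulas can be checked the same way; a minimal simulation sketch assuming numpy, with arbitrary $p$, $\lambda$ and $\mu$:

```python
# Covariance and correlation of XY and XZ versus the derived formulas
import numpy as np

rng = np.random.default_rng(seed=3)
p, lam, mu, n = 0.3, 2.0, 5.0, 1_000_000

X = rng.binomial(1, p, size=n)
Y = rng.poisson(lam, size=n)
Z = rng.poisson(mu, size=n)

U, V = X * Y, X * Z
cov_formula = (p - p**2) * lam * mu
var_U = p * lam + (p - p**2) * lam**2
var_V = p * mu + (p - p**2) * mu**2

print(np.cov(U, V)[0, 1], cov_formula)                                # covariance: empirical vs formula
print(np.corrcoef(U, V)[0, 1], cov_formula / np.sqrt(var_U * var_V))  # correlation: empirical vs formula
```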
 
  • #14
It looks correct to me. :unsure:
 
  • #15
Klaas van Aarsen said:
It looks correct to me. :unsure:

Great! Thank you! (Sun)
 

1. What is the difference between distribution and expected value?

The distribution of a random variable describes which values it can take and with what probabilities. The expected value is the probability-weighted average of those values. In simpler terms, the distribution lists the possible outcomes and their probabilities, while the expected value condenses them into a single long-run average (which need not be the most likely outcome).

2. How is variance related to expected value?

Variance measures how spread out a random variable is around its expected value. It is the expected value of the squared deviation from the mean, $\operatorname{Var}(X)=E\big[(X-E[X])^2\big]$, which can also be written as $E[X^2]-(E[X])^2$, the form used in the thread above. In other words, variance quantifies how far the values of the variable deviate, on average, from the expected value.
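For instance, the sample variance of simulated Bernoulli($p$) draws approaches the exact value $p(1-p)$ used in the thread above; a minimal sketch assuming numpy:

```python
# Sample variance of Bernoulli(p) draws versus the exact value p*(1 - p)
import numpy as np

rng = np.random.default_rng(seed=4)
p, n = 0.3, 1_000_000

X = rng.binomial(1, p, size=n)
print(np.var(X))      # empirical spread around the mean
print(p * (1 - p))    # exact Bernoulli variance
```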

3. What is covariance and how is it different from correlation?

Covariance is a measure of how two variables change together. Its sign shows the direction of the relationship: positive covariance means the variables tend to move in the same direction, negative covariance means they tend to move in opposite directions. Correlation is the standardized version, obtained by dividing the covariance by the product of the two standard deviations. It is unit-free and always lies between -1 and 1, with 0 indicating no linear relationship.
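A small illustration of the difference, assuming numpy: rescaling one variable changes the covariance but leaves the correlation unchanged.

```python
# Covariance depends on the scale of the data; correlation does not
import numpy as np

rng = np.random.default_rng(seed=5)
x = rng.normal(size=10_000)
y = 2 * x + rng.normal(size=10_000)   # linearly related, with noise

print(np.cov(x, y)[0, 1], np.corrcoef(x, y)[0, 1])
print(np.cov(10 * x, y)[0, 1], np.corrcoef(10 * x, y)[0, 1])   # covariance scales by 10, correlation stays the same
```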

4. How can covariance and correlation be used in data analysis?

Covariance and correlation are useful in data analysis as they help to identify patterns and relationships between variables. They can also be used to determine the strength and direction of the relationship between two variables, which can be helpful in making predictions or drawing conclusions from the data.

5. Can correlation imply causation?

No, correlation does not imply causation. Just because two variables have a strong correlation does not mean that one causes the other. It is important to consider other factors and conduct further research before making any causal claims based on correlation alone.
