Check my proof on showing two Bernoulli RV's are independent

SUMMARY

This discussion confirms that two Bernoulli random variables, X and Y, are independent if and only if the joint probability P(X=1, Y=1) equals the product P(X=1)P(Y=1). The proof under review argues that, for Bernoulli variables, independence is equivalent to zero covariance, using the identity Cov(X,Y) = E(XY) - E(X)E(Y). The poster cites an arXiv source in support and asks whether summing the marginal probability mass functions (pmfs) is also needed to establish independence.

PREREQUISITES
  • Understanding of Bernoulli random variables
  • Knowledge of probability mass functions (pmfs)
  • Familiarity with covariance and correlation concepts
  • Basic grasp of mathematical proofs in probability theory
NEXT STEPS
  • Study the properties of Bernoulli distributions and their applications
  • Learn about covariance and its implications in probability theory
  • Investigate the role of marginal and joint distributions in determining independence
  • Review mathematical proofs related to independence of random variables
USEFUL FOR

Statisticians, data scientists, and students of probability theory who are looking to deepen their understanding of random variable independence and covariance concepts.

Runty_Grunty
I've got a pretty good answer to this one already, yet I'd like to see how solid it is. I'll list the question first in quotes.
"Show that two Bernoulli random variables X and Y are independent if and only if P(X=1,Y=1)=P(X=1)P(Y=1)."

Here's my work below. I credit http://arxiv.org/PS_cache/arxiv/pdf/0909/0909.1685v4.pdf for the answer.

X and Y are independent if and only if P(X=i,Y=j)=P(X=i)P(Y=j) where i,j=0,1.
Two Bernoulli random variables are independent if and only if they are uncorrelated, and thus have a covariance of zero.
$$\operatorname{Corr}(X,Y)=0 \Leftrightarrow \operatorname{Cov}(X,Y)=0$$

Let $p_X$ and $p_Y$ denote the pmfs of X and Y. For Bernoulli variables, $E(XY)=P(X=1,Y=1)$, $E(X)=P(X=1)$, and $E(Y)=P(Y=1)$.
If X and Y are independent, then by definition
$$\operatorname{Cov}(X,Y)=E(XY)-E(X)E(Y)=P(X=1,Y=1)-P(X=1)P(Y=1)=0,$$
as $P(X=i,Y=j)=P(X=i)P(Y=j)$ for $i,j=0,1$.
If on the other hand we have that $\operatorname{Cov}(X,Y)=0$, then
$$P(X=1,Y=1)-P(X=1)P(Y=1)=0 \Rightarrow P(X=1,Y=1)=P(X=1)P(Y=1).$$
Therefore, X and Y are independent.
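As a quick numerical sanity check (the joint pmf values below are hypothetical, not from the problem), here is a short Python sketch: it computes Cov(X,Y) directly from a joint pmf and shows it vanishes when the joint is the product of the marginals, and is nonzero otherwise.

```python
# Sanity check: for Bernoulli X, Y, Cov(X,Y) = P(X=1,Y=1) - P(X=1)P(Y=1).
# The joint pmfs below are hypothetical examples, chosen only to sum to 1.

def cov_from_joint(joint):
    """Covariance of two Bernoulli RVs given joint[(i, j)] = P(X=i, Y=j)."""
    e_xy = sum(i * j * p for (i, j), p in joint.items())
    e_x = sum(i * p for (i, j), p in joint.items())
    e_y = sum(j * p for (i, j), p in joint.items())
    return e_xy - e_x * e_y

# Independent case: joint = product of marginals with P(X=1)=0.3, P(Y=1)=0.6.
px, py = 0.3, 0.6
indep = {(i, j): (px if i else 1 - px) * (py if j else 1 - py)
         for i in (0, 1) for j in (0, 1)}
print(cov_from_joint(indep))  # ≈ 0.0

# Dependent case: same marginals P(X=1)=0.3, P(Y=1)=0.4 won't factor here.
dep = {(0, 0): 0.6, (0, 1): 0.1, (1, 0): 0.0, (1, 1): 0.3}
print(cov_from_joint(dep))  # 0.3 - 0.3*0.4 ≈ 0.18
```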

This should properly answer the question, though I've been told by another source that summing up the marginal pmfs is also necessary to show independence. I don't know whether or not that's really necessary, though, and could use a second opinion.

Is there anything about my proof that could use improvement?
Note:

\begin{align*}
\operatorname{cov}(X,Y) &= E(XY) - E(X)E(Y) \\
&= \left(0 \cdot 0 \cdot p(0,0) + 0 \cdot 1 \cdot p(0,1) + 1 \cdot 0 \cdot p(1,0) + 1 \cdot 1 \cdot p(1,1)\right) \\
&\qquad - \left(0 \cdot p_x(0) + 1 \cdot p_x(1)\right)\left(0 \cdot p_y(0) + 1 \cdot p_y(1)\right) \\
&= p(1,1) - p_x(1)\,p_y(1) \tag{A}
\end{align*}

I think this relates to the "summing" comment. Also, nothing in the above calculation is based on an assumption of independence or dependence. From (A) you can say:

If X, Y are independent, then ...

and then

If cov(X,Y) = 0 it must be true that ... so ...
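Identity (A) can also be checked mechanically: since the factor i·j vanishes unless i = j = 1, only the (1,1) term survives in E(XY), and the marginals come from summing the joint pmf, which is the "summing" step in question. A small sketch with exact rational arithmetic (the pmf values are hypothetical):

```python
from fractions import Fraction as F

# Hypothetical joint pmf p(i, j); values chosen only to sum to 1.
p = {(0, 0): F(2, 10), (0, 1): F(1, 10), (1, 0): F(3, 10), (1, 1): F(4, 10)}

# E(XY): the factor i*j kills every term except i = j = 1, so E(XY) = p(1,1).
e_xy = sum(i * j * p[i, j] for i in (0, 1) for j in (0, 1))
assert e_xy == p[1, 1]

# Marginals obtained by summing the joint pmf over the other variable.
px1 = p[1, 0] + p[1, 1]   # P(X=1)
py1 = p[0, 1] + p[1, 1]   # P(Y=1)

# Identity (A): cov(X,Y) = p(1,1) - p_x(1) p_y(1).
cov = e_xy - px1 * py1
print(cov)  # 4/10 - (7/10)(5/10) = 1/20
```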
 
statdad said:
Note:

\begin{align*}
\operatorname{cov}(X,Y) &= E(XY) - E(X)E(Y) \\
&= \left(0 \cdot 0 \cdot p(0,0) + 0 \cdot 1 \cdot p(0,1) + 1 \cdot 0 \cdot p(1,0) + 1 \cdot 1 \cdot p(1,1)\right) \\
&\qquad - \left(0 \cdot p_x(0) + 1 \cdot p_x(1)\right)\left(0 \cdot p_y(0) + 1 \cdot p_y(1)\right) \\
&= p(1,1) - p_x(1)\,p_y(1) \tag{A}
\end{align*}

I think this relates to the "summing" comment. Also, nothing in the above calculation is based on an assumption of independence or dependence. From (A) you can say:

If X, Y are independent, then ...

and then

If cov(X,Y) = 0 it must be true that ... so ...

Thanks, I was wondering what was meant by that "summing" part.
 
