Independence of Random Variables

In summary: independence requires the support of the joint density to be a rectangle (a product set), but in general a support does not have to be a rectangle; it can be a triangle, or some other more complicated shape, and then the variables cannot be independent. Another example: take two random variables ##U, V## uniformly distributed on the triangle with vertices ##(0,0), (2,0), (0,1)##, let ##X = -U## and ##Y = 2U+V##, and check whether ##X, Y## are independent. (The support of the joint density ##f_{X,Y}(x,y)## is easy to draw graphically; the density is zero outside it.)
  • #1
showzen

Homework Statement


Given ##f_{X,Y}(x,y)=2e^{-x}e^{-y}\ ;\ 0<x<y\ ;\ y>0##,
The following theorem given in my book (Larsen and Marx) doesn't appear to hold.

Homework Equations


Definition
##X## and ##Y## are independent if for every interval ##A## and ##B##, ##P(X\in A \land Y\in B) = P(X\in A)P(Y\in B) ##.
Theorem
##X## and ##Y## are independent iff ##f_{X,Y}(x,y)=g(x)h(y)##.
If so, there is a constant ##k## such that ##f_X(x)=kg(x)## and ##f_Y(y)=(1/k)h(y)##.

The Attempt at a Solution


Consider ##g(x)=2e^{-x}## and ##h(y)=e^{-y}##. Then ##f_{X,Y}(x,y)=g(x)h(y)##, so the theorem indicates that ##X## and ##Y## are independent.
Attempting to find the constant ##k##:
##k=\int_0^\infty h(y)\,dy=1##
##k=\int_0^y g(x)\,dx = 2(1-e^{-y})##
These two values of ##k## contradict each other, and the second is not even a constant.

Am I missing something, or is the theorem incomplete in that it is lacking details on the intervals that the random variables are defined on?
 
  • #2
The theorem is correct, you should have: ##f_X(x) = \int_x^\infty f_{X,Y}(x,y)dy## and ##f_Y(y) = \int_0^y f_{X,Y}(x,y)dx##.
 
  • #3
showzen said:

Am I missing something, or is the theorem incomplete in that it is lacking details on the intervals that the random variables are defined on?

You are definitely missing something: your "product" formula for ##f_{X,Y}(x,y)## holds only over the region ##0 < x < y##. (Presumably, ##f_{X,Y}## is zero outside that region, though the question does not say so.) In any case, a bivariate density for a pair of independent random variables would need to be non-zero on the whole product region ##0 < x,y < \infty##.

To see explicitly that ##X,Y## are NOT independent, compute the marginal densities
$$ f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y) \, dy = \int_x^{\infty} 2 e^{-x} e^{-y} \, dy$$
and
$$f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x,y) \, dx =\int_0^y 2 e^{-x} e^{-y} \, dx .$$
Finish the calculations and then check whether ##f_X(x) f_Y(y)## agrees with your ##f_{X,Y} (x,y)##.
 
  • #4
##f_X(x) = 2\int_x^\infty e^{-x}e^{-y}dy = 2e^{-2x}##
##f_Y(y) = 2\int_0^y e^{-x}e^{-y}dx = 2e^{-y}(1-e^{-y})##
##f_X(x)f_Y(y) = 4e^{-2x}e^{-y}(1-e^{-y}) \neq f_{X,Y}(x,y)##
So we can show explicitly that they are not independent...
But we can separate ##f_{X,Y}(x,y)## into a product of ##g(x)## and ##h(y)## which implies independence by the theorem?
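A quick symbolic check (a sketch using sympy, not part of the original thread) reproduces the marginals above and confirms that their product does not recover the joint density:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = 2*sp.exp(-x)*sp.exp(-y)  # joint density, valid only on 0 < x < y

# Marginals: integrate out the other variable over the support 0 < x < y
f_X = sp.simplify(sp.integrate(f, (y, x, sp.oo)))  # 2*exp(-2*x)
f_Y = sp.simplify(sp.integrate(f, (x, 0, y)))      # 2*exp(-y)*(1 - exp(-y))

# The product of the marginals differs from the joint density
print(f_X, f_Y, sp.simplify(f_X*f_Y - f) == 0)
```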
 
  • #5
showzen said:
##f_X(x) = 2\int_x^\infty e^{-x}e^{-y}dy = 2e^{-2x}##
##f_Y(y) = 2\int_0^y e^{-x}e^{-y}dx = 2e^{-y}(1-e^{-y})##
##f_X(x)f_Y(y) = 4e^{-2x}e^{-y}(1-e^{-y}) \neq f_{X,Y}(x,y)##
So we can show explicitly that they are not independent...
But we can separate ##f_{X,Y}(x,y)## into a product of ##g(x)## and ##h(y)## which implies independence by the theorem?

No, no, no! Absolutely not! Your ##f_{X,Y}(x,y)## is NOT a product of some ##g(x)## and some ##h(y)## over the whole quarter plane ##0 < x, y < \infty##. It is a product, but only over the restricted region ##0 < x < y < \infty##, and that fact makes ##X## and ##Y## very definitely not independent. In fact, you should do what I suggested in my first response, which is to compute the marginals ##f_X(x)## and ##f_Y(y)##. You can use these to compute ##P(X \in A)## and ##P(Y \in B)## for ##A = \{ 0 < x < 1 \}## and ##B= \{ 0 < y < 2 \}##. Now compute ##P(X \in A) \cdot P(Y \in B)##. Finally, compute ##P(X \in A \; \& \; Y \in B)## by integrating ##f_{X,Y}(x,y)## over the appropriate 2-dimensional region. Do you get different answers? (I do.)
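The suggested computation can be sketched numerically (using scipy; the sets ##A = (0,1)## and ##B = (0,2)## are from the post above, and the marginals are those computed earlier in the thread):

```python
import numpy as np
from scipy.integrate import quad, dblquad

# Marginals computed earlier in the thread
fX = lambda x: 2*np.exp(-2*x)
fY = lambda y: 2*np.exp(-y)*(1 - np.exp(-y))

pA, _ = quad(fX, 0, 1)   # P(X in A), A = (0, 1)
pB, _ = quad(fY, 0, 2)   # P(Y in B), B = (0, 2)

# P(X in A and Y in B): integrate the joint density 2*exp(-x-y)
# over 0 < x < 1 and x < y < 2 (the support forces y > x)
joint = lambda y, x: 2*np.exp(-x - y)
pAB, _ = dblquad(joint, 0, 1, lambda x: x, lambda x: 2.0)

print(round(pA*pB, 4), round(pAB, 4))  # 0.6465 vs 0.6936 -- not equal
```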
 
  • #6
Yes, the answers are different. Can ##X## and ##Y## only be independent if they are defined over a rectangle?
 
  • #7
showzen said:
Yes, the answers are different. Can ##X## and ##Y## only be independent if they are defined over a rectangle?

Basically, yes, and for obvious reasons: you cannot make ##g(x) h(y)## come out as zero over a part of the ##(x,y)## plane where both ##g(x) \neq 0## and ##h(y) \neq 0##.
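This support argument can be checked directly (a small sketch, not from the thread): if ##X, Y## were independent, positive density at ##(x_1, y_1)## and at ##(x_2, y_2)## would force positive density at ##(x_2, y_1)## as well, which fails here:

```python
import math

def f(x, y):
    # Joint density from the thread: 2*exp(-x)*exp(-y) on 0 < x < y, else 0
    return 2*math.exp(-x)*math.exp(-y) if 0 < x < y else 0.0

# For independent X, Y the support is a product set: f(x1,y1) > 0 and
# f(x2,y2) > 0 would imply f(x2,y1) > 0. The triangular support breaks this.
print(f(0.5, 1.0) > 0, f(1.5, 2.0) > 0, f(1.5, 1.0))  # True True 0.0
```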
 

What is the definition of independence of random variables?

Independence of random variables refers to the concept that the occurrence or value of one random variable does not affect the occurrence or value of another random variable.

How is the independence of random variables determined?

The independence of random variables can be checked by comparing the joint probability of the variables to the product of their individual (marginal) probabilities. If the joint probability equals the product of the marginal probabilities for every pair of events, then the variables are independent.
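For discrete variables this check is mechanical. Here is a minimal sketch with a hypothetical 2x2 joint probability table (the numbers are made up for illustration):

```python
import numpy as np

# Hypothetical joint probability table P(X = i, Y = j)
joint = np.array([[0.1, 0.2],
                  [0.2, 0.5]])

px = joint.sum(axis=1)  # marginal distribution of X
py = joint.sum(axis=0)  # marginal distribution of Y

# Independent iff every joint entry equals the product of its marginals
independent = np.allclose(joint, np.outer(px, py))
print(independent)  # False: e.g. 0.1 != 0.3 * 0.3
```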

What is the significance of independence of random variables in statistics?

Independence of random variables is important in statistics because it allows for simpler analysis and calculations. When variables are independent, their joint probability can be easily calculated and used to make predictions.

Can two dependent random variables be considered independent?

No, if two random variables are dependent, meaning the occurrence or value of one variable does affect the occurrence or value of the other, then they cannot be considered independent.

What is the difference between independence and correlation of random variables?

Independence and correlation are related but distinct concepts. Independence refers to the lack of relationship between two random variables, while correlation refers to the strength and direction of the relationship between two variables.
