If X, Y, Z are mutually independent, are X*Y and X*Z independent?

In summary: consider whether a = xy and b = xz are dependent. Since x = a/y, we have b = (a/y)z = az/y, which means b depends on a, just as y depends on x in y = x + 2.
  • #1
zzzhhh
Suppose X, Y and Z are three mutually independent continuous random variables. Are the products X*Y and X*Z still independent? If yes, please prove it; if no, please come up with a counterexample. Thanks a lot!
 
  • #2
zzzhhh said:
Suppose X, Y and Z are three mutually independent continuous random variables. Are the products X*Y and X*Z still independent? If yes, please prove it; if no, please come up with a counterexample. Thanks a lot!

If you are wondering whether A = X*Y and B = X*Z are independent, the answer is no. The simple explanation is that the same random variable X appears in both products. If you want to prove this, consider computing the covariance of A and B.

If you are wondering about the PDFs of A and B, then you can use the fact that the joint density factorizes as f(x)f(y) for A and f(x)f(z) for B, and that is OK if X, Y, Z are all independent of each other.
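
For instance, a quick Monte Carlo check along these lines (a minimal sketch in Python, assuming NumPy is available; the uniform distributions are just an illustrative choice) shows the covariance of A and B is visibly non-zero:

[code]
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X, Y, Z mutually independent, here U(0, 1) as an illustrative choice
x, y, z = rng.random(n), rng.random(n), rng.random(n)
a, b = x * y, x * z

# Empirical covariance of A = XY and B = XZ;
# a clearly non-zero value rules out independence
print(np.cov(a, b)[0, 1])  # ~ 0.02, not 0
[/code]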
 
  • #3
chiro said:
If you are wondering whether A = X*Y and B = X*Z are independent, the answer is no. The simple explanation is that the same random variable X appears in both products. If you want to prove this, consider computing the covariance of A and B.

If you are wondering about the PDFs of A and B, then you can use the fact that the joint density factorizes as f(x)f(y) for A and f(x)f(z) for B, and that is OK if X, Y, Z are all independent of each other.

I can't get it. What do you mean by "the simple explanation"? And what does "that is OK" mean for getting the joint pdf of A and B?
 
  • #4
zzzhhh said:
I can't get it. What do you mean by "the simple explanation"? And what does "that is OK" mean for getting the joint pdf of A and B?

If you are wanting to know whether A is dependent on B, the answer is yes, and the simple answer is that the same random variable X is in both A and B. Again, if you want to prove that A and B are dependent, show that the covariance is non-zero in general and you're done. However, there is one caveat: a zero covariance doesn't imply independence, but a non-zero covariance definitely rules out independence. Also, the other way (along the same lines) is to show E[AB] ≠ E[A]E[B], and if this is shown then the variables are dependent.

The joint pdf of (X, Y) can be written as f(x)f(y) and that of (X, Z) as f(x)f(z), where f(x), f(y), f(z) are the PDFs of X, Y, and Z respectively; this holds only because they are independent.

However, you need to remember that if you want the expectation and variance, you need to find E[XY] and VAR[XY] for A and E[XZ] and VAR[XZ] for B, not E[X + Y] for example.

What I mean by the simple explanation is that if two random variables are both built from the same underlying random variable, then they are (in general) dependent.

Instead of thinking about them as random variables, think about them as real numbers. Consider whether a = xy and b = xz are dependent. Since x = a/y, we have b = (a/y)z = az/y, which means b depends on a, just like in y = x + 2, where y depends on x.

Remember that a random variable gets realized, and these realizations correspond to the situation above, where the random variable actually takes on a specific value. Although the realization changes every time according to the distribution, it is still the same idea.
 
  • #5
chiro said:
If you are wanting to know whether A is dependent on B, the answer is yes, and the simple answer is that the same random variable X is in both A and B. Again, if you want to prove that A and B are dependent, show that the covariance is non-zero in general and you're done. However, there is one caveat: a zero covariance doesn't imply independence, but a non-zero covariance definitely rules out independence. Also, the other way (along the same lines) is to show E[AB] ≠ E[A]E[B], and if this is shown then the variables are dependent.

The joint pdf of (X, Y) can be written as f(x)f(y) and that of (X, Z) as f(x)f(z), where f(x), f(y), f(z) are the PDFs of X, Y, and Z respectively; this holds only because they are independent.

However, you need to remember that if you want the expectation and variance, you need to find E[XY] and VAR[XY] for A and E[XZ] and VAR[XZ] for B, not E[X + Y] for example.

What I mean by the simple explanation is that if two random variables are both built from the same underlying random variable, then they are (in general) dependent.

Instead of thinking about them as random variables, think about them as real numbers. Consider whether a = xy and b = xz are dependent. Since x = a/y, we have b = (a/y)z = az/y, which means b depends on a, just like in y = x + 2, where y depends on x.

Remember that a random variable gets realized, and these realizations correspond to the situation above, where the random variable actually takes on a specific value. Although the realization changes every time according to the distribution, it is still the same idea.


No, b = az/y is not like the situation in y = x + 2, where y depends on x, because 2 is a constant while z/y is a random variable. Consider the situation y = x + w, where all variables are independently random: does y depend on x? I think not, because y can take whatever value no matter what x is. The same is true of b = az/y.
 
  • #6
zzzhhh said:
No, b = az/y is not like the situation in y = x + 2, where y depends on x, because 2 is a constant while z/y is a random variable. Consider the situation y = x + w, where all variables are independently random: does y depend on x? I think not, because y can take whatever value no matter what x is. The same is true of b = az/y.

No, you're missing the point.

The random variables at some point get realized: in other words, they end up taking a known value, and that value corresponds to the situation above.

Remember that for X, Y, Z you get what is called a realization (call it x, y, z for these random variables), and these realizations are just numbers. The probability of getting these numbers when we repeat the process depends on the actual distributions of X, Y, Z, but at every step of the process, the x in a = xy and the x in b = xz are the same x. This is why they are dependent.

If you don't understand, think about the first time you get a realization of X, Y, Z. For X call it x, for Y call it y, and for Z call it z. Now for A call it a and for B call it b. They all belong to one particular realization, and for the next realization they will most likely all be different, but the fact remains that they are just numbers for each realization.

Remember that a random variable has to be realized, and a realization means that when the process is executed, you get just a number.
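
To make the realization picture concrete, here is a minimal sketch (Python, assuming NumPy) that draws a few joint realizations and checks the identity b = az/y on each one:

[code]
import numpy as np

rng = np.random.default_rng(1)

# five joint realizations of X, Y, Z
x, y, z = rng.random(5), rng.random(5), rng.random(5)
a, b = x * y, x * z

# because the same realized x enters both products, b = a*z/y holds for every draw
print(np.allclose(b, a * z / y))  # True
[/code]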
 
  • #7
chiro said:
No, you're missing the point.

The random variables at some point get realized: in other words, they end up taking a known value, and that value corresponds to the situation above.

Remember that for X, Y, Z you get what is called a realization (call it x, y, z for these random variables), and these realizations are just numbers. The probability of getting these numbers when we repeat the process depends on the actual distributions of X, Y, Z, but at every step of the process, the x in a = xy and the x in b = xz are the same x. This is why they are dependent.

If you don't understand, think about the first time you get a realization of X, Y, Z. For X call it x, for Y call it y, and for Z call it z. Now for A call it a and for B call it b. They all belong to one particular realization, and for the next realization they will most likely all be different, but the fact remains that they are just numbers for each realization.

Remember that a random variable has to be realized, and a realization means that when the process is executed, you get just a number.

"This is why they are dependent." I still don't know why if a and b have common factor x then they are dependent (in terms of rigorous definition appeared in all probability theory textbook).

OK, mathematics is not literature or philosophy. If you think A and B may be dependent, please come up with a concrete, rigorous example as my original post asks. Thanks!
 
  • #8
Here I can come up with an example of independence: we know the sample mean and sample variance are independent for a normal population. These two statistics have many things in common -- both are built from the same random sample, and the definition of the sample variance even contains the sample mean -- but they are independent.
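
As a rough numerical illustration (a sketch, assuming NumPy; zero correlation alone does not prove independence, but it is consistent with the known result for normal samples):

[code]
import numpy as np

rng = np.random.default_rng(2)

# 200,000 samples of size 10 from a standard normal population
data = rng.normal(size=(200_000, 10))
xbar = data.mean(axis=1)        # sample means
s2 = data.var(axis=1, ddof=1)   # sample variances

# near-zero correlation, consistent with the independence
# of sample mean and sample variance for normal data
print(np.corrcoef(xbar, s2)[0, 1])
[/code]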
 
  • #9
To be somewhat more specific, suppose X, Y and Z are all mutually independent random variables from the uniform distribution on (0,1). I can work out that A = XY has the pdf f(a) = -ln a for 0 < a < 1 and 0 elsewhere, and so does B = XZ. Can you tell me the joint pdf f_AB(a,b) of A and B, and then show that it does not equal f_A(a)*f_B(b) at some point (a,b) in the plane?
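
Short of deriving f_AB(a, b) in closed form, one can already see the factorization fail at a single point by comparing P(A ≤ t, B ≤ t) with P(A ≤ t)P(B ≤ t). For t = 0.25 the joint probability works out to 2t - t^2 = 0.4375, while the product is (t - t ln t)^2 ≈ 0.356. A Monte Carlo sketch (Python, NumPy assumed) of that comparison:

[code]
import numpy as np

rng = np.random.default_rng(3)
n = 2_000_000
x, y, z = rng.random(n), rng.random(n), rng.random(n)
a, b = x * y, x * z

t = 0.25
p_joint = np.mean((a <= t) & (b <= t))      # estimates P(A <= t, B <= t); theory: 2t - t^2 = 0.4375
p_prod = np.mean(a <= t) * np.mean(b <= t)  # estimates P(A <= t)P(B <= t); theory: ~0.356

# the gap shows the joint law does not factorize, so A and B are dependent
print(p_joint, p_prod)
[/code]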
 
  • #10
zzzhhh said:
"This is why they are dependent." I still don't know why if a and b have common factor x then they are dependent (in terms of rigorous definition appeared in all probability theory textbook).

OK, mathematics is not literature or philosophy. If you think A and B may be dependent, please come up with a concrete rigorous example as my original post says. Thanks!

This means that you don't really understand what a random variable is and what a realization is.

A random variable represents the probability of a particular value being 'realized' (discrete case) or of an interval of values being 'realized' (continuous case).

The realization is an actual value, usually a one-dimensional number. When you actually measure something, you get such a value.

Now again, for X, Y, Z, each 'realization' of X, Y, and Z gives you the corresponding numbers x, y, z. This is the intuitive answer. If you do not understand this, then again you don't understand what random variables really are.

The rigorous route is as follows: show E[AB] ≠ E[A]E[B].

We will use the covariance formula, which gives us:

COV(A,B) = E[(A - E[A])(B - E[B])] = E[(XY - E[XY])(XZ - E[XZ])]
= E[XYXZ] - E[XY]E[XZ] - E[XY]E[XZ] + E[XY]E[XZ]
= E[X^2YZ] - E[XY]E[XZ]

Now, for COV(A,B) = 0 we would need E[XY]E[XZ] = E[X^2YZ], which is exactly what independence of XY and XZ would give. But we can evaluate both sides using the independence of X, Y, Z:
E[X^2YZ] = E[X^2]E[YZ] = E[X^2]E[Y]E[Z]
E[XY]E[XZ] = E[X]E[Y]E[X]E[Z] = E[X]^2E[Y]E[Z]

Remember that if you have independence, then E[XY] = E[X]E[Y]; all I am doing is partitioning A and B down to the simplest factors X, Y and Z.

But then COV(A,B) = 0 would require E[X^2] = E[X]^2 (at least whenever E[Y]E[Z] ≠ 0), and this is not the case in general: when the equality fails, you have dependence. This is the proof.

Again, though, it is a lot better to have some kind of intuition for what is going on, and while I agree that proofs are important, I think they can be nearly useless if you don't really know what is going on, because it means you are trusting in something without really understanding it.

As an exercise, take three coins corresponding to X, Y, and Z. For each step, flip the three coins, write down the values (1 for tails, 2 for heads), and then for each set of 3 tosses write down A and B and look at what you get: you will see that for every set of tosses, if a = xy and b = xz then b = az/y. This should really help you see what I meant in earlier posts. If you aren't convinced: for every step, do three tosses, record the x, y, z values, calculate a and b, and check the formula for b in terms of a.
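
A throwaway script (a sketch using plain Python's random module) that automates exactly this coin experiment:

[code]
import random

random.seed(4)
for step in range(5):
    # one set of three tosses: 1 for tails, 2 for heads
    x, y, z = (random.choice((1, 2)) for _ in range(3))
    a, b = x * y, x * z
    # b == a*z/y on every single set of tosses, because the same x enters both products
    print(f"x={x} y={y} z={z} a={a} b={b} b==az/y: {b == a * z / y}")
[/code]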
 
  • #11
Also remember that if E[X^2] = E[X]^2 then VAR[X] = E[X^2] - E[X]^2 = 0. So the only time this can hold is when you have a deterministic number. This proves the result for the general case.
 
  • #12
chiro said:
This means that you don't really understand what a random variable is and what a realization is.

A random variable represents the probability of a particular value being 'realized' (discrete case) or of an interval of values being 'realized' (continuous case).

The realization is an actual value, usually a one-dimensional number. When you actually measure something, you get such a value.

Now again, for X, Y, Z, each 'realization' of X, Y, and Z gives you the corresponding numbers x, y, z. This is the intuitive answer. If you do not understand this, then again you don't understand what random variables really are.

The rigorous route is as follows: show E[AB] ≠ E[A]E[B].

We will use the covariance formula, which gives us:

COV(A,B) = E[(A - E[A])(B - E[B])] = E[(XY - E[XY])(XZ - E[XZ])]
= E[XYXZ] - E[XY]E[XZ] - E[XY]E[XZ] + E[XY]E[XZ]
= E[X^2YZ] - E[XY]E[XZ]

Now, for COV(A,B) = 0 we would need E[XY]E[XZ] = E[X^2YZ], which is exactly what independence of XY and XZ would give. But we can evaluate both sides using the independence of X, Y, Z:
E[X^2YZ] = E[X^2]E[YZ] = E[X^2]E[Y]E[Z]
E[XY]E[XZ] = E[X]E[Y]E[X]E[Z] = E[X]^2E[Y]E[Z]

Remember that if you have independence, then E[XY] = E[X]E[Y]; all I am doing is partitioning A and B down to the simplest factors X, Y and Z.

But then COV(A,B) = 0 would require E[X^2] = E[X]^2 (at least whenever E[Y]E[Z] ≠ 0), and this is not the case in general: when the equality fails, you have dependence. This is the proof.

Again, though, it is a lot better to have some kind of intuition for what is going on, and while I agree that proofs are important, I think they can be nearly useless if you don't really know what is going on, because it means you are trusting in something without really understanding it.

As an exercise, take three coins corresponding to X, Y, and Z. For each step, flip the three coins, write down the values (1 for tails, 2 for heads), and then for each set of 3 tosses write down A and B and look at what you get: you will see that for every set of tosses, if a = xy and b = xz then b = az/y. This should really help you see what I meant in earlier posts. If you aren't convinced: for every step, do three tosses, record the x, y, z values, calculate a and b, and check the formula for b in terms of a.


You are just listing some formulas from a textbook. It is neither a rigorous proof nor a counterexample. I think a mathematical question is the clearest thing in the world -- it is either true, if you can prove it, or false, if you can give a counterexample. Any effort to turn a mathematical problem into literature or philosophy is meaningless. So, thank you for your good will in trying to help, but I think you cannot answer my question; please leave it to other people who can. Thanks!
 
  • #13
zzzhhh said:
You are just listing some formulas from a textbook. It is neither a rigorous proof nor a counterexample. I think a mathematical question is the clearest thing in the world -- it is either true, if you can prove it, or false, if you can give a counterexample. Any effort to turn a mathematical problem into literature or philosophy is meaningless. So, thank you for your good will in trying to help, but I think you cannot answer my question; please leave it to other people who can. Thanks!

Proving E[XY] ≠ E[X]E[Y] is a rigorous proof that X is not independent of Y. It follows from the definition that if X and Y are independent then E[XY] - E[X]E[Y] = 0, period. This is rigorous.

I've already proven above that the only case where your situation can hold for X is if VAR[X] = 0. Re-read it again if you have to.

The above is a proof that if that condition holds then independence must be ruled out (but you can't show the converse: it's a one-way implication).

Again, even if you need this proof, you seem to have no intuition whatsoever for what is going on, and if that is the case then you won't really understand anything about probability or statistics.

Again, look at the proof above and at how I create the various partitions of X^2YZ and at what has to be satisfied for zero covariance.
 
  • #14
chiro said:
Proving E[XY] ≠ E[X]E[Y] is a rigorous proof that X is not independent of Y. It follows from the definition that if X and Y are independent then E[XY] - E[X]E[Y] = 0, period. This is rigorous.

I've already proven above that the only case where your situation can hold for X is if VAR[X] = 0. Re-read it again if you have to.

The above is a proof that if that condition holds then independence must be ruled out (but you can't show the converse: it's a one-way implication).

Again, even if you need this proof, you seem to have no intuition whatsoever for what is going on, and if that is the case then you won't really understand anything about probability or statistics.

Again, look at the proof above and at how I create the various partitions of X^2YZ and at what has to be satisfied for zero covariance.

Please check your proof again. How did E[XY]E[XZ] = E[X^2YZ] imply E[X^2] = E[X]^2?
 
  • #15
zzzhhh said:
Please check your proof again. How did E[XY]E[XZ] = E[X^2YZ] imply E[X^2] = E[X]^2?

It's a recursive decomposition. Also it's E[X^2] = E[X]E[X] = E[X]^2.

Remember that you have to have independence for every single decomposition. If I have two random variables X and Y, I only have one decomposition, E[X]E[Y], in terms of X and Y; but if I have X, Y, Z then I can decompose in more than one way.

As an example, E[XYZ] = E[X]E[YZ] = E[XY]E[Z] = E[XZ]E[Y] = E[X]E[Y]E[Z] must hold if the pairs (XY, Z) and (XZ, Y) are independent, and likewise for the pairs (X, Y) and (X, Z).

So what I have done is let F = X^2 and G = YZ. Then E[F]E[G] = E[X^2]E[YZ] = E[X^2]E[Y]E[Z], but if you choose the other decomposition you have E[XY]E[XZ] = E[X]E[Y]E[X]E[Z] = E[X]E[X]E[Y]E[Z]. This implies that for independence, E[X^2]E[Y]E[Z] = E[X]E[X]E[Y]E[Z], which implies E[X^2] = E[X]E[X].

Now Var(X) = E[X^2] - E[X]^2, but E[X^2] = E[X]E[X] implies Var(X) must be zero for this to hold, which means X is just a constant if this property holds. If X is not constant and has non-zero variance, then the equality doesn't hold, which implies that the two variables A and B are not independent.

Think of some variable F = XYZ = X(YZ) = (XY)Z = (X)(Y)(Z) = X((Y)(Z)) = ((X)(Y))Z, and this should give you the picture.
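
To see numerically that the two decompositions disagree unless Var(X) = 0, compare E[X^2] with E[X]^2 (a sketch, NumPy assumed; for U(0,1) the exact values are 1/3 versus 1/4):

[code]
import numpy as np

rng = np.random.default_rng(5)
x = rng.random(1_000_000)  # X ~ U(0, 1)

# E[X^2] = 1/3 but (E[X])^2 = 1/4, so the required equality fails for any non-constant X
print(np.mean(x**2), np.mean(x)**2)
[/code]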
 
  • #16
@ chiro:
Note that in general: [itex]E(X^2)≠(E(X))^2[/itex]

@ zzzhhh:
The proof is a simple application of the definition of covariance and the properties of independent random variables. The question should be in the Homework forum.

Proof:
[itex]Cov(A,B)≠0 \Rightarrow[/itex] A and B are dependent, so prove that in general it is not true that [itex]Cov(A,B)=0[/itex].
[itex]Cov(A,B)=E(AB)-E(A)E(B)=E(X^2YZ)-E(XY)E(XZ)=E(X^2)E(Y)E(Z)-(E(X))^2E(Y)E(Z)=(E(X^2)-(E(X))^2)E(Y)E(Z)\Rightarrow[/itex]
Since [itex]E(X^2)≠(E(X))^2[/itex] in general, the expression above is not in general equal to zero, which is what we wanted to prove.
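
For the uniform-(0,1) setting of post #9 this gives [itex]Cov(A,B)=Var(X)E(Y)E(Z)=\frac{1}{12}\cdot\frac{1}{2}\cdot\frac{1}{2}=\frac{1}{48}≈0.0208[/itex]; a quick simulation (a sketch, assuming NumPy) matches both sides:

[code]
import numpy as np

rng = np.random.default_rng(6)
n = 2_000_000
x, y, z = rng.random(n), rng.random(n), rng.random(n)

lhs = np.cov(x * y, x * z)[0, 1]                                 # Cov(A, B) directly
rhs = (np.mean(x**2) - np.mean(x)**2) * np.mean(y) * np.mean(z)  # (E[X^2] - E[X]^2) E[Y] E[Z]
print(lhs, rhs)  # both ~ 1/48 ~ 0.0208
[/code]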
 
Last edited:
  • #17
oleador said:
Proof:
[itex]Cov(A,B)≠0 \Rightarrow[/itex] A and B independent,

?? Where from? Cov[A,B] ≠ 0 certainly implies A & B are dependent.
 
  • #18
ssd said:
?? Where from? Cov[A,B] ≠ 0 certainly implies A & B are dependent.

Sorry, that's a typo. Obviously, I meant "dependent".
Edited the post.
 
  • #19
Obvious typo. :)
 
  • #20
oleador said:
Since [itex]E(X^2)≠(E(X))^2[/itex] in general, the expression above is not in general equal to zero, which is what we wanted to prove.

Correct deduction.
E[X^2] = (E[X])^2 can only happen when V(X) = 0, i.e. when X is a constant.
 
  • #21
oleador said:
@ chiro:
Note that in general: [itex]E(X^2)≠(E(X))^2[/itex]

You don't have to convince me, that's what I've been saying all along.
 
  • #22
chiro said:
You don't have to convince me, that's what I've been saying all along.

Ok, perhaps I misread your proof.
 
  • #23
Without doing any real math, there is an easy counterexample when X takes the value 0 with probability 1
 
  • #24
Office_Shredder said:
Without doing any real math, there is an easy counterexample when X takes the value 0 with probability 1

This is just one example of a random variable having zero variance (which has been stated previously). In fact, you can have the random variable take any finite value with probability 1 and the result still works.
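
A minimal sketch (NumPy assumed) of that degenerate case, with X pinned to an arbitrary constant:

[code]
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000
x = np.full(n, 3.0)                 # X = 3 with probability 1, so Var(X) = 0
y, z = rng.random(n), rng.random(n)
a, b = x * y, x * z                 # A = 3Y and B = 3Z, independent because Y and Z are

print(np.cov(a, b)[0, 1])           # ~ 0, as predicted by Cov(A,B) = Var(X) E[Y] E[Z]
[/code]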
 

1. What does it mean for variables to be mutually independent?

When variables are mutually independent, the value taken by any one of them does not affect the distribution of the others. In other words, the joint distribution of every subcollection factorizes into the product of the corresponding marginals, so the variables have no influence on each other.

2. How do we determine if X*Y and X*Z are independent?

To determine if X*Y and X*Z are independent, we need to check whether the joint probability distribution of the two products is equal to the product of their individual (marginal) distributions everywhere. If this is true, then X*Y and X*Z are independent.
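
In practice, with samples rather than closed-form distributions, one can probe this definition pointwise; here is a sketch (Python, NumPy assumed; the function name is hypothetical) comparing a joint probability with the product of marginals:

[code]
import numpy as np

def independence_gap(a, b, ta, tb):
    """Estimate P(A<=ta, B<=tb) - P(A<=ta)P(B<=tb) from paired samples.

    A gap far from 0 demonstrates dependence; a gap near 0 at every
    (ta, tb) is only consistent with (not proof of) independence.
    """
    joint = np.mean((a <= ta) & (b <= tb))
    return joint - np.mean(a <= ta) * np.mean(b <= tb)

rng = np.random.default_rng(8)
x, y, z = rng.random(500_000), rng.random(500_000), rng.random(500_000)
print(independence_gap(x * y, x * z, 0.25, 0.25))  # clearly positive: dependent
[/code]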

3. Can X*Y and X*Z be independent if X, Y, and Z are not independent?

Yes, it is possible for X*Y and X*Z to be independent even if X, Y, and Z are not mutually independent. For example, if X takes the value 0 with probability 1, then X*Y and X*Z are both constant (zero) and hence independent, no matter how Y and Z are related.

4. What is the significance of knowing if X*Y and X*Z are independent?

Knowing whether X*Y and X*Z are independent helps us make accurate assumptions in statistical analyses. Independence lets us simplify calculations, such as factorizing joint distributions, making the relationship between the variables easier to interpret.

5. Are there any specific conditions that must be met for X*Y and X*Z to be independent?

As the discussion above shows, mutual independence of X, Y, and Z is not sufficient: the shared factor X makes X*Y and X*Z dependent in general. Essentially, X must have zero variance (take a single value with probability 1, as in the counterexample above) for the products to be independent.
