If X and Y are independent, are X^k and Y?

  • Context: Graduate 
  • Thread starter: AxiomOfChoice
  • Tags: Independent

Discussion Overview

The discussion centers on the independence of random variables, specifically whether the independence of two random variables X and Y implies that X^k and Y are independent for each power k. The scope includes measure-theoretic perspectives and the implications of independence in probability theory.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Some participants express confusion over the measure-theoretic definition of independence and its implications for functions of random variables.
  • One participant notes that if X and Y are independent, then E[f(X)g(Y)] = E[f(X)]E[g(Y)] for any functions f and g, but questions how this holds from a measure-theoretic standpoint.
  • Another participant explains that expectations can be expressed as integrals involving probability density functions, leading to the conclusion that the joint density of independent variables is the product of their individual densities.
  • One participant challenges the relevance of the expectation result to the original question, emphasizing that uncorrelated random variables can still be dependent.
  • Another participant suggests using the probability expression P[X<=x,Y<=y] = P[X<=x]P[Y<=y] to demonstrate independence from a measure-theoretic perspective.
  • A later reply references a specific theorem from a book on stochastic calculus that provides a measure-theoretic proof of the independence of functions of independent random variables.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the implications of independence for functions of random variables, with multiple competing views and unresolved questions about the measure-theoretic aspects of the topic.

Contextual Notes

Participants highlight limitations in understanding the measurability of functions and the complexity of proving independence in the context of measure theory.

AxiomOfChoice
The definition of independence of random variables from a measure-theoretic standpoint is so confusing (independence of generated sigma-algebras, etc.) that I cannot answer this seemingly simple question... So if X, Y are independent random variables, does that mean X, X^2, X^3, X^4, ... and Y, Y^2, Y^3, ... are each pairwise independent?
 
Okay...just looked this up in a book. In fact, if X,Y are independent, then we have E[f(X)g(Y)] = E[f(X)]E[g(Y)] for "any" functions f and g. But the proof is from a non-measure-theoretic probability book. Is there anyone who can explain why this holds from a measure-theoretic standpoint?
 
Expectations can be expressed as integrals involving the probability density functions. Since X and Y are independent, their joint density is simply the product of their individual densities, so the expectations involving functions of the random variables end up as the product of the individual expectations.
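This product rule can be checked numerically. The following is an illustrative sketch (my addition, not from the thread), with arbitrary choices of distributions and of the functions f(x) = x^2 and g(y) = y^3:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Independent random variables: X ~ N(0, 1), Y ~ Uniform(0, 1)
x = rng.standard_normal(n)
y = rng.uniform(0.0, 1.0, n)

f = lambda t: t**2  # arbitrary Borel-measurable functions
g = lambda t: t**3

lhs = np.mean(f(x) * g(y))           # Monte Carlo estimate of E[f(X) g(Y)]
rhs = np.mean(f(x)) * np.mean(g(y))  # estimate of E[f(X)] E[g(Y)]

print(lhs, rhs)  # the two estimates agree up to Monte Carlo error
```

Here E[X^2] = 1 and E[Y^3] = 1/4, so both estimates should be close to 0.25.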
 
AxiomOfChoice said:
Okay...just looked this up in a book. In fact, if X,Y are independent, then we have E[f(X)g(Y)] = E[f(X)]E[g(Y)] for "any" functions f and g.

I don't know what that result has to do with your original question. The conclusion deals with the expectations of f(X) and g(Y), not with their independence. Random variables can be uncorrelated and still be dependent.

I'm not an expert on measure theory, but I did take the course years ago. I think answering your original post (which concerns functions of a random variable) in detail is complicated. For example, not all functions are "measurable". The ones you listed are. How do we prove they are? Do you want an explanation that begins at elementary points like that, or do you simply want a theorem from measure theory that answers your question as a special case?
 
It's probably easier to use P[X<=x,Y<=y] = P[X<=x]P[Y<=y] (which can be obtained from the measure-theoretic definition of independence by considering generators of the Borel sigma algebras for R and R^2).
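A quick numerical illustration of this CDF factorization (my addition, using arbitrary distributions and thresholds), applied directly to the original question with X^2 in place of X:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

x = rng.standard_normal(n)    # X ~ N(0, 1)
y = rng.uniform(0.0, 1.0, n)  # Y ~ Uniform(0, 1), independent of X

a, b = 1.0, 0.5  # arbitrary thresholds

joint = np.mean((x**2 <= a) & (y <= b))      # estimate of P[X^2 <= a, Y <= b]
prod = np.mean(x**2 <= a) * np.mean(y <= b)  # estimate of P[X^2 <= a] P[Y <= b]

print(joint, prod)  # agree up to sampling error
```

With these choices, P[X^2 <= 1] = P[|X| <= 1] is about 0.683 and P[Y <= 0.5] = 0.5, so both estimates should be near 0.341.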
 
Let X and Y be independent random variables and f and g Borel-measurable functions on R. A measure-theoretic proof that f(X) and g(Y) are independent random variables is given in Shreve's book "Stochastic Calculus for Finance II"; see Theorem 2.2.5.
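A sketch of the standard argument behind such a theorem (my summary, not a quote from Shreve): for Borel sets A and B,

```latex
\begin{align*}
P[f(X) \in A,\; g(Y) \in B]
  &= P[X \in f^{-1}(A),\; Y \in g^{-1}(B)] \\
  &= P[X \in f^{-1}(A)]\, P[Y \in g^{-1}(B)]
     && \text{(independence of } X \text{ and } Y\text{)} \\
  &= P[f(X) \in A]\, P[g(Y) \in B],
\end{align*}
```

where f^{-1}(A) and g^{-1}(B) are Borel sets because f and g are Borel-measurable. Equivalently, every event generated by f(X) is an event generated by X, so sigma(f(X)) is contained in sigma(X), and independence of the generated sigma-algebras carries over. Taking f(x) = x^j and g(y) = y^k (which are continuous, hence Borel-measurable) answers the original question affirmatively.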
 
