X and Y identically distributed implies E(X) = E(Y) and var(X) = var(Y)


Homework Help Overview

The problem is to prove that if two random variables X and Y on the same probability space are identically distributed, then their expected values and variances are equal. The context is probability theory and the basic properties of random variables.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss ways to approach the proof, including starting with simple random variables and building up to the general case. Questions arise about the relationship between the cumulative distribution functions (CDFs) of X and Y, and whether the expected value can be expressed in terms of these CDFs.

Discussion Status

Some participants suggest exploring the properties of CDFs and theorems related to integrals involving random variables. There is acknowledgment of the potential complexity of the proof, with some expressing uncertainty about the necessity of approximating the random variables.

Contextual Notes

Participants note the existence of certain theorems that may be relevant to the proof, while others express a lack of familiarity with these theorems and indicate a willingness to explore them further.

e(ho0n3
Homework Statement
Let [itex](S, \Sigma, P)[/itex] be a probability space. Let X and Y be two random variables on S that satisfy [itex]P \circ X^{-1} = P \circ Y^{-1}[/itex] (i.e. they are identically distributed) and such that E(X), E(Y), var(X) and var(Y) exist and are finite. Prove that E(X) = E(Y) and var(X) = var(Y).

The attempt at a solution
I can only think of proving this the long and tedious way: first when X and Y are simple, then for X and Y bounded, then for X and Y nonnegative, and then finally for X and Y integrable. Is there an easier way?
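For what it's worth, here is a sketch of the simple case, using only the definition of the integral of a simple random variable and the hypothesis [itex]P \circ X^{-1} = P \circ Y^{-1}[/itex]. If X takes the finitely many values [itex]x_1, \dots, x_k[/itex], then Y takes the same values with the same probabilities (almost surely), so

[tex]E(X) = \sum_{i=1}^k x_i\, P(X = x_i) = \sum_{i=1}^k x_i\, (P \circ X^{-1})(\{x_i\}) = \sum_{i=1}^k x_i\, (P \circ Y^{-1})(\{x_i\}) = \sum_{i=1}^k x_i\, P(Y = x_i) = E(Y).[/tex]

The same computation with [itex]x_i^2[/itex] in place of [itex]x_i[/itex] gives E(X^2) = E(Y^2), hence var(X) = var(Y).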
 
Billy Bob
Can you easily show X and Y have the same cdf? Then do you have a definition for E[X^n] in terms of the cdf of X?
 
e(ho0n3
Billy Bob said:
Can you easily show X and Y have the same cdf? Then do you have a definition for E[X^n] in terms of the cdf of X?
They do have the same CDF but I don't have a definition of E[X^n] in terms of the CDF of X. All I know is that

[tex] E[X^n] = \int_S X^n \, dP[/tex]

whenever the integral exists.
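
Spelled out, the CDF claim follows directly from the hypothesis: for every [itex]t \in \mathbb{R}[/itex],

[tex]F_X(t) = P(X \le t) = (P \circ X^{-1})\left( (-\infty, t] \right) = (P \circ Y^{-1})\left( (-\infty, t] \right) = P(Y \le t) = F_Y(t).[/tex]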
 
Billy Bob
Do you have a theorem saying something like

[tex]\int f\, d(P\circ X^{-1})=\int (f\circ X)\, dP[/tex] ?

If so then you can use f(x)=x, f(x)=x^2, or even f(x)=x^n.

If you don't have such a theorem, you could prove it. I can't see how to avoid some of the "long" method. Maybe it is shorter than you thought. I don't think you have to approximate X and Y, just the two functions f(x)=x and f(x)=x^2. If you just want to do those two cases, use (or imagine using) a specific sequence of simple (step, in fact) functions.

Maybe you can get away with observing that for [tex]f= \chi_E[/tex] we have [tex]\chi_E\circ X=\chi_{X^{-1}(E)}[/tex], and then saying a magic phrase like "the result follows by linear combinations and limits." Probably not. I think it might be worth writing it out, just to verify my claim that X and Y don't need to be approximated.
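
Written out, the two pieces look like this (a sketch, assuming the usual construction of the Lebesgue integral). With the change-of-variables identity in hand, take f(x) = x and f(x) = x^2:

[tex]E(X) = \int_S X \, dP = \int_{\mathbb{R}} x \, d(P \circ X^{-1}) = \int_{\mathbb{R}} x \, d(P \circ Y^{-1}) = \int_S Y \, dP = E(Y),[/tex]

and likewise E(X^2) = E(Y^2), so var(X) = E(X^2) - E(X)^2 = E(Y^2) - E(Y)^2 = var(Y). For the identity itself, the indicator case carries all the measure theory: for a Borel set E,

[tex]\int_{\mathbb{R}} \chi_E \, d(P \circ X^{-1}) = (P \circ X^{-1})(E) = P\left( X^{-1}(E) \right) = \int_S \chi_{X^{-1}(E)} \, dP = \int_S (\chi_E \circ X) \, dP.[/tex]

Linearity then extends it to simple f, monotone convergence to nonnegative measurable f, and the decomposition [itex]f = f^+ - f^-[/itex] to integrable f. Only f gets approximated, never X or Y themselves.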
 
e(ho0n3
Billy Bob said:
Do you have a theorem saying something like

[tex]\int f\, d(P\circ X^{-1})=\int (f\circ X)\, dP[/tex] ?

If so then you can use f(x)=x, f(x)=x^2, or even f(x)=x^n.

If you don't have such a theorem, you could prove it.

I don't know of such a theorem. I'll try to prove that though (seems interesting).
 
