X and Y identically distributed implies E(X) = E(Y) and var(X) = var(Y)

In summary: since X and Y are identically distributed they have the same CDF, but the poster does not have a definition of E[X^n] in terms of the CDF of X.
  • #1
e(ho0n3
Homework Statement
Let [itex](S, \Sigma, P)[/itex] be a probability space. Let X and Y be two random variables on S that satisfy [itex]P \circ X^{-1} = P \circ Y^{-1}[/itex] (i.e. they are identically distributed) and such that E(X), E(Y), var(X) and var(Y) exist and are finite. Prove that E(X) = E(Y) and var(X) = var(Y).

The attempt at a solution
I can only think of proving this the long and tedious way: first when X and Y are simple, then for X and Y bounded, then for X and Y nonnegative, and then finally for X and Y integrable. Is there an easier way?
 
  • #2
Can you easily show X and Y have the same cdf? Then do you have a definition for E[X^n] in terms of the cdf of X?
 
  • #3
Billy Bob said:
Can you easily show X and Y have the same cdf? Then do you have a definition for E[X^n] in terms of the cdf of X?
They do have the same CDF, but I don't have a definition of E[X^n] in terms of the CDF of X. All I know is that

[tex]
E[X^n] = \int_S X^n \, dP
[/tex]

whenever the integral exists.
 
  • #4
Do you have a theorem saying something like

[tex]\int f\, d(P\circ X^{-1})=\int (f\circ X)\, dP[/tex] ?

If so then you can use f(x)=x, f(x)=x^2, or even f(x)=x^n.

If you don't have such a theorem, you could prove it. I can't see how to avoid some of the "long" method. Maybe it is shorter than you thought. I don't think you have to approximate X and Y, just the two functions f(x)=x and f(x)=x^2. If you just want to do those two cases, use (or imagine using) a specific sequence of simple (step, in fact) functions.

Maybe you can get away with observing that for [tex]f= \chi_E[/tex] we have [tex]\chi_E\circ X=\chi_{X^{-1}(E)}[/tex], and then saying a magic phrase like "the result follows by linear combinations and limits." Probably not. I think it might be worth writing it out, just to verify my claim that X and Y don't need to be approximated.
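For the record, the "linear combinations and limits" phrase expands into the standard four-step machine applied to f (not to X or Y, consistent with the claim above); a sketch of the standard measure-theory argument, not taken from any particular textbook:

```latex
\begin{align*}
\textbf{Step 1 (indicators):}\quad
  &\int \chi_E \, d(P\circ X^{-1}) = (P\circ X^{-1})(E) = P(X^{-1}(E))
   = \int \chi_{X^{-1}(E)}\, dP = \int (\chi_E \circ X)\, dP.\\
\textbf{Step 2 (simple $f$):}\quad
  &\text{both integrals are linear in $f$, so equality extends to simple functions.}\\
\textbf{Step 3 ($f \ge 0$):}\quad
  &\text{choose simple $f_n \uparrow f$ and apply monotone convergence on both sides.}\\
\textbf{Step 4 (integrable $f$):}\quad
  &\text{write $f = f^+ - f^-$ and subtract.}
\end{align*}
```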
 
  • #5
Billy Bob said:
Do you have a theorem saying something like

[tex]\int f\, d(P\circ X^{-1})=\int (f\circ X)\, dP[/tex] ?

If so then you can use f(x)=x, f(x)=x^2, or even f(x)=x^n.

If you don't have such a theorem, you could prove it.

I don't know of such a theorem. I'll try to prove that though (seems interesting).
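The theorem in #4 (integration against the pushforward measure) can at least be sanity-checked on a finite probability space; a minimal sketch, with the space, measure, and variable all my own choices rather than anything from the thread:

```python
from fractions import Fraction

# Finite probability space S = {0, 1, 2, 3} with uniform measure P.
S = [0, 1, 2, 3]
P = {s: Fraction(1, 4) for s in S}

# A random variable X on S (not injective, so the pushforward is nontrivial).
X = {0: -1, 1: 1, 2: 1, 3: 2}

# Pushforward measure P o X^{-1} on the range of X.
pushforward = {}
for s in S:
    pushforward[X[s]] = pushforward.get(X[s], Fraction(0)) + P[s]

def f(x):
    return x ** 2  # try f(x) = x, f(x) = x^2, f(x) = x^n, ...

# Left side: integral of f with respect to P o X^{-1}.
lhs = sum(f(x) * p for x, p in pushforward.items())
# Right side: integral of f o X with respect to P.
rhs = sum(f(X[s]) * P[s] for s in S)

print(lhs, rhs)  # 7/4 7/4 -- the two integrals agree
assert lhs == rhs
```

With f(x) = x this gives E(X) two ways, and with f(x) = x^2 the second moment, which is exactly what the problem needs.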
 

1. What does it mean for X and Y to be identically distributed?

Two random variables X and Y are identically distributed when they induce the same probability distribution, i.e. P(X ∈ E) = P(Y ∈ E) for every measurable set E of values; equivalently, they have the same CDF. It does not mean X = Y as functions on the sample space, only that every event about their values is equally likely for both.

2. How does identically distributed X and Y lead to E(X) = E(Y)?

If X and Y are identically distributed, then E(X) = E(Y) because the expected value is determined entirely by the distribution: it is the integral of the identity function against the distribution (in the discrete case, the sum of each value weighted by its probability). Since X and Y assign the same probabilities to the same sets of values, the two integrals coincide.
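In the measure-theoretic setting of the homework statement, this paragraph is the one-line computation below, assuming the change-of-variables identity discussed in posts #4 and #5:

```latex
E(X) = \int_S X \, dP
     = \int_{\mathbb{R}} x \, d(P \circ X^{-1})
     = \int_{\mathbb{R}} x \, d(P \circ Y^{-1})
     = \int_S Y \, dP
     = E(Y)
```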

3. What is the relationship between var(X) and var(Y) when X and Y are identically distributed?

If X and Y are identically distributed, then var(X) = var(Y) as well, because the variance E[(X − E(X))²] is also determined by the distribution alone: it averages the squared deviation from the mean, and both the mean and the weights in that average depend only on the probabilities assigned to each value, which X and Y share.
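Likewise for the variance: writing it in terms of the first two moments shows it depends only on the pushforward measure (again assuming the change-of-variables identity from post #4):

```latex
\operatorname{var}(X) = E(X^2) - E(X)^2
  = \int_{\mathbb{R}} x^2 \, d(P \circ X^{-1})
    - \left( \int_{\mathbb{R}} x \, d(P \circ X^{-1}) \right)^{2}
```

Since P ∘ X⁻¹ = P ∘ Y⁻¹, the right-hand side is unchanged when X is replaced by Y, so var(X) = var(Y).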

4. Does X and Y being identically distributed imply that they have the same mean and variance?

Yes: identical distribution determines every quantity that is computed from the distribution, in particular the mean and the variance. However, the converse fails: two random variables can have the same mean and variance without being identically distributed, since the first two moments do not pin down a distribution.
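That caveat is easy to witness with a concrete pair; a quick sketch (my own example, not from the thread): a fair ±1 coin and a three-point distribution on {−2, 0, 2} both have mean 0 and variance 1, yet are plainly different distributions.

```python
from fractions import Fraction

def mean(dist):
    """Expected value of a discrete distribution {value: probability}."""
    return sum(x * p for x, p in dist.items())

def var(dist):
    """Variance E[(X - E X)^2] of a discrete distribution."""
    m = mean(dist)
    return sum((x - m) ** 2 * p for x, p in dist.items())

# X: fair coin on {-1, +1}.
X = {-1: Fraction(1, 2), 1: Fraction(1, 2)}
# Y: three-point distribution on {-2, 0, 2}.
Y = {-2: Fraction(1, 8), 0: Fraction(3, 4), 2: Fraction(1, 8)}

assert mean(X) == mean(Y) == 0
assert var(X) == var(Y) == 1
assert X != Y  # same mean and variance, but not identically distributed
```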

5. Can X and Y be identically distributed but have different probability distributions?

No. "Identically distributed" means precisely that X and Y have the same probability distribution, so by definition their distributions cannot differ. If the two distributions disagreed on any set of values, X and Y would not be identically distributed.
