Show that X+Y has a finite second moment

In summary, a random variable has a finite second moment when the expected value of its square, E(X^2), is finite; this is closely related to the variance, which is the second moment about the mean. The thread asks for a proof that if X and Y each have a finite second moment, then so does X+Y. The argument rests on the expansion (X+Y)^2 ≤ X^2 + Y^2 + 2|XY| together with the Cauchy-Schwarz inequality, which bounds E(|XY|) in terms of E(X^2) and E(Y^2); an alternative route uses the elementary bound (X+Y)^2 ≤ 2X^2 + 2Y^2. A finite second moment for X+Y means there is a finite limit to the variation of its values, which matters for understanding its behavior, and it is a standing assumption for many statistical techniques and models. The converse implication fails: X+Y can have a finite second moment even when X and Y individually do not.
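In symbols, the statement discussed in the thread (a compact restatement, using the factor-of-2 bound given in post #9):

[tex]
E(X^2) < \infty \text{ and } E(Y^2) < \infty \;\Longrightarrow\; E[(X+Y)^2] \le 2E(X^2) + 2E(Y^2) < \infty
[/tex]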
  • #1
kingwinner
Prove that if X and Y have finite second moments (i.e. E(X^2) and E(Y^2) are finite), then X+Y has a finite second moment.


(X+Y)^2 ≤ X^2 + Y^2 + 2|XY|
=> E[(X+Y)^2] ≤ E(X^2) + E(Y^2) + 2E(|XY|)
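For reference, the first inequality is just the binomial expansion combined with the bound XY ≤ |XY| (a small sketch of that step):

[tex]
(X+Y)^2 = X^2 + 2XY + Y^2 \le X^2 + 2|XY| + Y^2
[/tex]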

I don't understand the (probably incomplete) proof. On the right side, E(X^2) and E(Y^2) are finite, but how can we know whether E(|XY|) is finite or not?

Thanks for explaining!
 
  • #2
kingwinner said:
Prove that if X and Y have finite second moments (i.e. E(X^2) and E(Y^2) are finite), then X+Y has a finite second moment.


(X+Y)^2 ≤ X^2 + Y^2 + 2|XY|
=> E[(X+Y)^2] ≤ E(X^2) + E(Y^2) + 2E(|XY|)

I don't understand the (probably incomplete) proof. On the right side, E(X^2) and E(Y^2) are finite, but how can we know whether E(|XY|) is finite or not?

Thanks for explaining!

This uses a classic inequality (really from measure theory, but applied to probability).

Essentially, if both [tex] X, Y [/tex] have finite second moments (variances exist) then they have finite moments of every lower order. For your specific case:

[tex]
E[|XY|]^2 \le E(X^2) E(Y^2) < \infty
[/tex]

where the RHS is finite because of the assumptions about the second-order moments being finite.
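For completeness, a sketch of how this bound would finish the argument, reading the left-hand side as [E(|XY|)]^2 (the interpretation discussed in the posts below):

[tex]
E[(X+Y)^2] \le E(X^2) + E(Y^2) + 2E(|XY|) \le E(X^2) + E(Y^2) + 2\sqrt{E(X^2)E(Y^2)} < \infty
[/tex]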
 
  • #3
Hi statdad,

statdad said:
This uses a classic inequality (really from measure theory, but applied to probability).
Is it in any way related to the Cauchy-Schwarz inequality?

Essentially, if both [tex] X, Y [/tex] have finite second moments (variances exist) then they have finite moments of every lower order.
I don't see how this fact can be applied to our problem. E(|XY|), E(X^2), and E(Y^2) are all second moments, right?

For your specific case:
[tex]
E[|XY|]^2 \le E(X^2) E(Y^2) < \infty
[/tex]
where the RHS is finite because of the assumptions about the second-order moments being finite.
For the left side of the inequality, do you mean E(|XY|^2) or [E(|XY|)]^2 ?


Thanks for your help!:smile:
 
  • #4
I am still stuck on this problem and would appreciate if anyone could help me out...
 
  • #5
Sorry for the delay - no excuse on my part.

Yes, as you pointed out, the moment-inequality is from the function version of the C-S inequality. My post should read

[tex]
E[|XY|^2] \le E(X^2) E(Y^2)
[/tex]
 
  • #6
statdad said:
Sorry for the delay - no excuse on my part.

Yes, as you pointed out, the moment-inequality is from the function version of the C-S inequality. My post should read

[tex]
E[|XY|^2] \le E(X^2) E(Y^2)
[/tex]
That's OK, don't worry.

But the version of C-S inequality that I've seen in Wikipedia is the following:
|E(XY)|^2 ≤ E(X^2) E(Y^2)

http://en.wikipedia.org/wiki/Cauchy–Schwarz_inequality#Probability_theory

We have:
(X+Y)^2 ≤ X^2 + Y^2 + 2|XY|
E[(X+Y)^2] ≤ E(X^2) + E(Y^2) + 2E(|XY|)
We are given that E(X^2) and E(Y^2) are both finite, but how can we show that E(|XY|) is finite?
For E(|XY|), here we have the absolute value inside the expectation, but for the left side of the C-S inequality, the absolute value is outside.

Thanks for your help!
 
  • #7
Consider [tex] X [/tex] - the proof for [tex] Y [/tex] is similar.

[tex]
E[|X|]^2 = E[|X| \cdot 1 ]^2 = \left(\int |x| \cdot 1 \, dF(x)\right)^2 \le \left(\int |x|^2 \, dF(x)\right) \cdot \left(\int 1^2 \, dF(x)\right) = E[X^2] < \infty
[/tex]

so the existence of a finite second moment gives the existence of the finite first moment.
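A similar application of the same inequality, this time to the pair |X| and |Y|, handles the troublesome term E(|XY|) directly (a sketch, using the fact that |X|^2 = X^2):

[tex]
[E(|XY|)]^2 = [E(|X| \cdot |Y|)]^2 \le E(|X|^2) \, E(|Y|^2) = E(X^2) \, E(Y^2) < \infty
[/tex]

so E(|XY|) is finite, and the inequality in post #1 then gives a finite E[(X+Y)^2].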
 
  • #8
statdad said:
Consider [tex] X [/tex] - the proof for [tex] Y [/tex] is similar.

[tex]
E[|X|]^2 = E[|X| \cdot 1 ]^2 = \left(\int |x| \cdot 1 \, dF(x)\right)^2 \le \left(\int |x|^2 \, dF(x)\right) \cdot \left(\int 1^2 \, dF(x)\right) = E[X^2] < \infty
[/tex]

so the existence of a finite second moment gives the existence of the finite first moment.
But E(|X|^2) = E(X^2) always, no? (since |X|^2 = X^2)

Also, I don't see how the C-S inequality |E(XY)|^2 ≤ E(X^2) E(Y^2) would necessarily imply that E(|XY|^2) ≤ E(X^2) E(Y^2) as you said in post #5. And how can we use this to prove that E(|XY|) is finite? Could you please explain this part?

Thanks a lot!:smile:
 
  • #9
kingwinner said:
Prove that if X and Y have finite second moments (i.e. E(X^2) and E(Y^2) are finite), then X+Y has a finite second moment.


(X+Y)^2 ≤ X^2 + Y^2 + 2|XY|
=> E[(X+Y)^2] ≤ E(X^2) + E(Y^2) + 2E(|XY|)

I don't understand the (probably incomplete) proof. On the right side, E(X^2) and E(Y^2) are finite, but how can we know whether E(|XY|) is finite or not?

Thanks for explaining!

A quick method is |X+Y| <= 2 max(|X|,|Y|), so (X+Y)^2 <= 4 max(X^2,Y^2) <= 4(X^2+Y^2)
=> E[(X+Y)^2] <= 4E[X^2] + 4E[Y^2] < infinity
the 4 can be replaced by 2, but this is enough.
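A sketch of the factor-of-2 variant mentioned at the end, using the elementary inequality (a+b)^2 ≤ 2(a^2 + b^2), which follows from (a-b)^2 ≥ 0:

[tex]
(X+Y)^2 \le 2X^2 + 2Y^2 \quad\Longrightarrow\quad E[(X+Y)^2] \le 2E(X^2) + 2E(Y^2) < \infty
[/tex]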
 

1. What is the definition of a finite second moment?

A finite second moment is a mathematical property of a random variable X: the expected value of its square, E(X^2), is finite. It is closely related to the variance Var(X) = σ² = E(X^2) - [E(X)]^2, which is the second moment about the mean. A random variable has a finite second moment if the integral of its squared values against its distribution is finite.
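In symbols, for a random variable X with distribution function F (a minimal statement of the definition):

[tex]
E(X^2) = \int x^2 \, dF(x) < \infty
[/tex]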

2. How is the finite second moment calculated for X+Y?

The second moment of X+Y is E[(X+Y)^2] = E(X^2) + E(Y^2) + 2E(XY). The corresponding variance is obtained by adding the individual variances of X and Y and twice their covariance: Var(X+Y) = Var(X) + Var(Y) + 2Cov(X,Y), where Cov(X,Y) is the covariance between X and Y.
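A short sketch of where this identity comes from, writing \mu_X = E(X) and \mu_Y = E(Y):

[tex]
\mathrm{Var}(X+Y) = E\left[\left((X-\mu_X)+(Y-\mu_Y)\right)^2\right] = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X,Y)
[/tex]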

3. What does it mean for X+Y to have a finite second moment?

If X+Y has a finite second moment, it means that E[(X+Y)^2], the expected value of the squared sum, is finite. This indicates that the values of X+Y are not too spread out, and there is a finite limit to the amount of variation in the values. This is an important property in probability and statistics, as it helps determine the stability and predictability of the random variable.

4. Why is it important to show that X+Y has a finite second moment?

Showing that X+Y has a finite second moment is important because it provides information about the behavior and characteristics of the random variable. It helps in understanding the spread of the variable and its potential outcomes. Additionally, a finite second moment is a necessary condition for many statistical techniques and models (for example, the classical central limit theorem assumes finite variance), making it an essential concept in probability and statistics.

5. Can X+Y have a finite second moment if X and Y individually do not?

Yes, it is possible for X+Y to have a finite second moment even if X and Y individually do not, although not because of the covariance (the covariance is not even defined unless both variables have finite second moments). For example, if X has a Cauchy distribution and Y = -X, then neither X nor Y has a finite second moment, yet X+Y = 0 does. The implication proved in this thread only goes one way: finite second moments for X and Y guarantee a finite second moment for X+Y, not conversely.
