Expected Value of Positive-Valued RV


Homework Help Overview

The discussion revolves around proving that for a positive-valued random variable (RV) X, the inequality E(X^k) ≥ E(X)^k holds for all k≥1. Participants are exploring the implications of this inequality, particularly in cases where the expected value diverges.

Discussion Character

  • Conceptual clarification, Assumption checking, Mixed

Approaches and Questions Raised

  • Some participants present a counter-example involving a positive-valued RV with a diverging expected value, questioning the validity of the proof under such conditions. Others discuss the implications of infinite expectations and whether comparisons can be made when expectations do not exist as real numbers.

Discussion Status

The discussion is ongoing, with participants examining different interpretations of the problem and the conditions under which the inequality might hold. Some suggest adding convergence hypotheses to clarify the proof, while others debate the assumptions made regarding the distribution of the RV.

Contextual Notes

There is a focus on the nature of expectations, particularly in cases where E[X] = ∞, and the implications for E[X^k]. Participants are considering the distinction between different forms of non-existence of expectations and how that affects the proof.

TranscendArcu

Homework Statement


Prove that if X is a positive-valued RV, then E(X^k) ≥ E(X)^k for all k≥1

The Attempt at a Solution


Why do I feel like this is a counter-example:

X = {1, 2, 4, 8, 16, ...} (a positive-valued RV)
m(X) = {1/2, 1/4, 1/8, 1/16, ...} (a distribution function that sums to one)

Yet clearly,

E[X] = \sum_{k=1}^{\infty} \frac{1}{2^k}\, 2^{k-1} = \frac{1}{2} + \frac{1}{2} + \frac{1}{2} + \frac{1}{2} + \cdots = \infty

So the expected value diverges (i.e. it doesn't exist as a real number). So I can't do the proof, because for this RV the expectation DNE.
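A quick numerical sanity check of this divergence (added for illustration, not part of the original thread; it assumes the intended mass function is P(X = 2^(k-1)) = 2^(-k) for k = 1, 2, 3, ...):

# Each term of E[X] is (1/2**k) * 2**(k-1) = 1/2, so the partial sums grow without bound.
def partial_moment(n_terms, power=1):
    """Partial sum of E[X**power]: sum of (1/2**k) * (2**(k-1))**power for k = 1..n_terms."""
    return sum((0.5 ** k) * (2 ** (k - 1)) ** power for k in range(1, n_terms + 1))

for n in (10, 20, 40):
    print(n, partial_moment(n))  # 5.0, 10.0, 20.0 -- roughly n/2, so E[X] diverges

# Truncating X to its first n atoms and renormalizing gives a genuine finite RV,
# and for that RV the claimed inequality E[X^2] >= E[X]^2 does hold.
def truncated_moment(n_terms, power):
    probs = [0.5 ** k for k in range(1, n_terms + 1)]
    total = sum(probs)  # renormalizing constant, slightly less than 1
    vals = [2 ** (k - 1) for k in range(1, n_terms + 1)]
    return sum((p / total) * v ** power for p, v in zip(probs, vals))

print(truncated_moment(10, 2) >= truncated_moment(10, 1) ** 2)  # True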
 
TranscendArcu said:

Homework Statement


Prove that if X is a positive-valued RV, then E(X^k) ≥ E(X)^k for all k≥1

The Attempt at a Solution


Why do I feel like this is a counter-example:

X = {1, 2, 4, 8, 16, ...} (a positive-valued RV)
m(X) = {1/2, 1/4, 1/8, 1/16, ...} (a distribution function that sums to one)

Yet clearly,

E[X] = \sum_{k=1}^{\infty} \frac{1}{2^k}\, 2^{k-1} = \frac{1}{2} + \frac{1}{2} + \frac{1}{2} + \frac{1}{2} + \cdots = \infty

So the expected value diverges (i.e. it doesn't exist as a real number). So I can't do the proof, because for this RV the expectation DNE.

This is also an allowable case if you accept that ∞ ≥ ∞, and that EX = ∞ implies EX^k = ∞ for any k > 1.

However, in the case that EX < ∞ and EX^k < ∞, can you do the proof then?

RGV
 
TranscendArcu said:

Homework Statement


Prove that if X is a positive-valued RV, then E(X^k) ≥ E(X)^k for all k≥1

The Attempt at a Solution


Why do I feel like this is a counter-example:

X = {1, 2, 4, 8, 16, ...} (a positive-valued RV)
m(X) = {1/2, 1/4, 1/8, 1/16, ...} (a distribution function that sums to one)

Yet clearly,

E[X] = \sum_{k=1}^{\infty} \frac{1}{2^k}\, 2^{k-1} = \frac{1}{2} + \frac{1}{2} + \frac{1}{2} + \frac{1}{2} + \cdots = \infty

So the expected value diverges (i.e. it doesn't exist as a real number). So I can't do the proof, because for this RV the expectation DNE.

Why can't you? Both E[X^2] and E[X]^2 are infinite, right?
 
So I went and got this from Wikipedia:

[attached screenshot from Wikipedia]


I wouldn't have much of a problem accepting ∞ ≥ ∞ if I thought that the expectation of such an RV even existed. Since the expectation diverges in my example, it seems meaningless to me to discuss comparisons of its nonexistent expected value.
 
I suspect you have over-complicated things... surely it is not true, in general, that the sum of powers is equal to the power of the sum - especially if all elements in the sum are positive? Consider if X is drawn from a set of two values for instance...

That's if I've read it correctly that you have to show:
E(X^k) = \frac{1}{N}\sum_{i=1}^{N}(x_i)^k = \bigg( \frac{1}{N}\sum_{i=1}^{N}x_i \bigg)^k = \big( E(X) \big)^k
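For a concrete instance of the two-value case suggested above (a worked example added for illustration, not from the thread): let X take the values 1 and 3 with probability 1/2 each, and take k = 2. Then

E(X^2) = \frac{1}{2}(1)^2 + \frac{1}{2}(3)^2 = 5 \quad\text{while}\quad E(X)^2 = \Big( \frac{1}{2}(1) + \frac{1}{2}(3) \Big)^2 = 4,

so the two quantities are not equal in general, although the inequality E(X^2) ≥ E(X)^2 from the problem statement does hold here.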
 
TranscendArcu said:
So I went and got this from Wikipedia:

[attached screenshot from Wikipedia]


I wouldn't have much of a problem accepting ∞ ≥ ∞ if I thought that the expectation of such an RV even existed. Since the expectation diverges in my example, it seems meaningless to me to discuss comparisons of its nonexistent expected value.

Opinions vary, and not all books would agree with that Wiki article. Of course, the expectation does not exist (as a real number), but in such a case we sometimes write EX = ∞ and pretend that ∞ lives in the extended real number system.

Note that there are essentially two kinds of "EX does not exist": (1) EX = ∞; and (2) X takes values in all of ℝ and, writing X = X_+ - X_- with X_± ≥ 0, we have EX_+ = EX_- = ∞ (that is, EX would be of the form ∞ - ∞).

Anyway, if you don't like this, add finiteness statements to the hypotheses. That still leaves you with something to prove.

RGV
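For concreteness, an example of case (2) above (added for illustration, not from the thread): let Y take the values ±2^(k-1), each with probability 2^(-(k+1)), for k = 1, 2, 3, .... These probabilities sum to 1, and

E[Y_+] = \sum_{k=1}^{\infty} \frac{1}{2^{k+1}}\, 2^{k-1} = \sum_{k=1}^{\infty} \frac{1}{4} = \infty, \qquad E[Y_-] = \infty

by symmetry, so EY would be of the form ∞ - ∞ and does not exist even in the extended reals.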
 
Simon Bridge said:
I suspect you have over-complicated things... surely it is not true, in general, that the sum of powers is equal to the power of the sum - especially if all elements in the sum are positive? Consider if X is drawn from a set of two values for instance...

That's if I've read it correctly that you have to show:
E(X^k) = \frac{1}{N}\sum_{i=1}^{N}(x_i)^k = \bigg( \frac{1}{N}\sum_{i=1}^{N}x_i \bigg)^k = \big( E(X) \big)^k
I don't think that's what I'm trying to show. I want to show a weak inequality, not an equality. Also, you seem to have assumed a uniform distribution (i.e. the 1/N weights), which does not seem a fair assumption to me.
 
Ray Vickson said:
Opinions vary, and not all books would agree with that Wiki article. Of course, the expectation does not exist (as a real number), but in such a case we sometimes write EX = ∞ and pretend that ∞ lives in the extended real number system.

Note that there are essentially two kinds of "EX does not exist": (1) EX = ∞; and (2) X takes values in all of ℝ and, writing X = X_+ - X_- with X_± ≥ 0, we have EX_+ = EX_- = ∞ (that is, EX would be of the form ∞ - ∞).

Anyway, if you don't like this, add finiteness statements to the hypotheses. That still leaves you with something to prove.

RGV
Yes, I think I'll have to add convergence hypotheses. In which case, the proof is immediate from Jensen's Inequality.
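For reference, the Jensen step spelled out (added for clarity; this is the standard argument once E[X] and E[X^k] are assumed finite): the map \varphi(x) = x^k is convex on (0, \infty) for every k ≥ 1, so Jensen's inequality gives

E(X)^k = \varphi\big(E[X]\big) \leq E\big[\varphi(X)\big] = E(X^k).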
 
TranscendArcu said:
Yes, I think I'll have to add convergence hypotheses. In which case, the proof is immediate from Jensen's Inequality.

I think you also need to show that EX = ∞ implies EX^k = ∞ for all k > 1; that is, if EX does not exist then neither does EX^k.

RGV
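One way to fill in that last implication (a sketch added for illustration, not from the thread): for every x > 0 and k ≥ 1 we have x^k ≥ x - 1 pointwise (check the cases x ≥ 1 and 0 < x < 1 separately), so

E(X^k) \geq E(X) - 1 = \infty

whenever E(X) = ∞.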
 
