
Expectation over Sets

  1. Nov 7, 2013 #1
    I'm having trouble working out a few details from my probability book. It says that if ##P(A_n)## goes to zero, then the integral of ##X## over ##A_n## goes to zero as well. My book says it's because of the monotone convergence theorem, but this confuses me because I thought that theorem is about a sequence ##X_n## converging to ##X##. Here is my attempt anyway:

    We can write the integral as
    [tex] \lim_{n \to \infty} E\big[ X \, I(A_n) \big] = E\Big[ \lim_{n \to \infty} X \, I(A_n) \Big] = E[X \cdot 0] = E[0] = 0. [/tex]
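    If it helps, here is a sketch of one standard route (my own attempt, not necessarily the book's proof, and it assumes ##E|X| < \infty##): split on the event ##\{|X| \le k\}##,
    [tex] \int_{A_n} |X| \, dP \;=\; \int_{A_n} |X| \, I(|X| \le k) \, dP \;+\; \int_{A_n} |X| \, I(|X| > k) \, dP \;\le\; k \, P(A_n) \;+\; E\big[\, |X| \, I(|X| > k) \,\big]. [/tex]
    Letting ##n \to \infty## kills the first term, so ##\limsup_n \int_{A_n} |X| \, dP \le E[\,|X| \, I(|X| > k)\,]## for every ##k##; and since ##|X| \, I(|X| \le k) \uparrow |X|##, monotone convergence gives ##E[\,|X| \, I(|X| > k)\,] \to 0## as ##k \to \infty##. If this is what the book means, that would be where the monotone convergence theorem comes in.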
     
  3. Nov 8, 2013 #2

    mathman

    Science Advisor
    Gold Member

    It would be helpful if you defined the symbols: ##A_n##, ##X##, ##X_n##, etc.
     
  4. Nov 9, 2013 #3
    Let ##A_n## be a sequence of sets indexed by ##n##, with ##P(A_n) \to 0## as ##n \to \infty##. Let ##X## be a random variable. Prove that the limit as ##n \to \infty## of the integral of ##X## over ##A_n##, with respect to the probability measure ##P##, equals zero. I thought the monotone convergence theorem applies when a sequence of random variables ##X_n## converges monotonically to ##X##, so that you can interchange taking the limit and taking the integral (expectation). How does it apply here?
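    (For reference, the interchange I have in mind is the usual monotone convergence statement, for a sequence with ##0 \le X_n \uparrow X##:)
    [tex] 0 \le X_n \uparrow X \quad \Longrightarrow \quad \lim_{n \to \infty} E[X_n] = E\Big[ \lim_{n \to \infty} X_n \Big] = E[X]. [/tex]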

    Thanks.
     
  5. Nov 9, 2013 #4

    verty

    Homework Helper

    This may be totally wrong, but if the probability of any ##A_i## is greater than 0, then surely the integral is greater than 0, being the probability of a superset? I suppose this must be wrong but it is the most obvious way to interpret the question.
     
  6. Nov 9, 2013 #5
    I do agree that the integral of anything over a set of measure zero is zero. However, if ##X## takes the value 0 over a set ##B## with ##P(B) > 0##, then the expectation (integral) of ##X## over that set ##B## is zero. The idea that an integral over a set with probability zero is zero is certainly intuitive; it is the details of the proof that I am wondering about, namely, when you can take the limit and let ##n## go to infinity. Surely we cannot just apply the limit immediately, but must calculate the integral somehow, perhaps as I tried in my post above.
     
  7. Nov 9, 2013 #6

    mathman

    Science Advisor
    Gold Member

    The "theorem" is false. Example:

    Let ##X## be a random variable with a Cauchy distribution, and let ##A_n## be the set of points where ##X > n##. Then ##P(A_n) \to 0##, but the integral of ##X## over ##A_n## is infinite for all ##n##.
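    To spell that out (assuming the standard Cauchy density ##p(x) = \frac{1}{\pi(1+x^2)}##):
    [tex] P(A_n) = \int_n^{\infty} \frac{dx}{\pi(1+x^2)} = \frac{1}{2} - \frac{\arctan n}{\pi} \;\to\; 0, \qquad \int_{A_n} x \, p(x) \, dx = \int_n^{\infty} \frac{x}{\pi(1+x^2)} \, dx = \infty \ \text{ for every } n. [/tex]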
     
  8. Nov 9, 2013 #7
    You are right. Suppose I add the assumption that ##E(X)## is finite. Does that take care of this counterexample and the issue?
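    (My own check, in case it is useful: the Cauchy distribution has no finite mean, so requiring ##E|X| < \infty## does rule that example out:)
    [tex] E|X| = \int_{-\infty}^{\infty} \frac{|x|}{\pi(1+x^2)} \, dx = \frac{2}{\pi} \int_0^{\infty} \frac{x}{1+x^2} \, dx = \infty. [/tex]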
     
  9. Nov 9, 2013 #8

    Office_Shredder

    Staff Emeritus
    Science Advisor
    Gold Member

    When you say the integral of X over An, do you mean
    [tex] \int_{A_n} x p(x) dx [/tex]
    or
    [tex] \int_{A_n} p(x) dx [/tex]

    where ##p(x)## is the density?
     
  10. Nov 9, 2013 #9
    The first one, I believe. My book usually writes it as ##\int_{A_n} X \, dP##, but what you have written in the first one is equivalent, correct? And then you just take the limit as ##n \to \infty## outside the integral.

    How do you know ##P(\{X > n\}) \to 0##? Is this true for all random variables with a finite expectation?
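    (A thought on that last question, my own reasoning rather than anything from the book: for a real-valued ##X## the events ##\{X > n\}## decrease to the empty set, so ##P(X > n) \to 0## by continuity of the measure, with no integrability needed; if in addition ##E|X| < \infty##, Markov's inequality even gives a rate.)
    [tex] \{X > n\} \downarrow \varnothing \;\Rightarrow\; P(X > n) \to 0, \qquad \text{and if } E|X| < \infty: \quad P(X > n) \le P(|X| > n) \le \frac{E|X|}{n}. [/tex]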
     



