
Convergence of Mean Question

  1. Feb 22, 2013 #1
    1. The problem statement, all variables and given/known data

    Let [itex]X_1, X_2, \ldots[/itex] be a sequence of independent random variables with [itex]E(X_i)=\mu_i[/itex] and [itex]\frac{1}{n}\sum_{i=1}^{n}\mu_i \rightarrow \mu[/itex].

    Show that [itex]\overline{X}\rightarrow\mu[/itex] in probability.


    2. Relevant equations

    NA

    3. The attempt at a solution

    I feel as if this shouldn't be too hard, but I'm having some trouble getting started. Any point in the right direction would be helpful. Thanks!
     
  3. Feb 22, 2013 #2
    Or rather, the better question might be: is this as easy as it looks? Namely, is it as simple as showing that, because of the equality given, [itex]\overline{X} = \mu[/itex], and so they must converge?
     
  4. Feb 22, 2013 #3

    jbunniii

    Is any information given about the variances of ##X_i##? Are they known to be finite, for example?
     
  5. Feb 22, 2013 #4
    Well, this was a little confusing. The previous question also gives that
    [itex]Var(X_i)=\sigma^2[/itex].

    However, it also says that the sum of the variances converges to 0, which clearly means that [itex]\overline{X}\rightarrow\mu[/itex] as well (and this is the key to the first problem, which asks me to show the same thing, i.e. that [itex]\overline{X}\rightarrow\mu[/itex]).

    The second question says simply to "Let [itex]X_i[/itex] be as in Problem 1" and then adds the additional information I included in the OP.

    So I guess I assume that [itex]Var(X_i)=\sigma^2[/itex] still applies, but not the bit about the sum of the variances converging to 0 (because this would make the second problem as trivial as the first), and that instead we're supposed to use the new fact about the expected values converging to [itex]\mu[/itex].

    What do you think?
     
  6. Feb 22, 2013 #5

    micromass

    So they actually ask you to prove the WLLN in a problem??
     
  7. Feb 22, 2013 #6
    I suppose so, and this is what I found confusing: it seems quite trivial, and I'm wondering if I'm missing something here.

    I apply Chebyshev's inequality with the given information and I'm done, right?
     
  8. Feb 22, 2013 #7

    micromass

    Chebyshev requires some knowledge of the variance, though, so you need to know it's finite. In that case, the proof is indeed trivial.

    However, some versions of the WLLN also hold without any knowledge of the variance. For example, I know that if the [itex]X_n[/itex] are iid and the first moment is finite, then you can also prove that the means converge in probability.
    Needless to say, the proof is going to be more difficult, since you can't apply Chebyshev.
     
  9. Feb 22, 2013 #8
    So carrying over that [itex]Var(X_i)=\sigma^2[/itex] is of no use here? From that can't I just use Chebyshev?

    I believe that this fact is supposed to carry over from the previous problem.
     
  10. Feb 22, 2013 #9

    micromass

    If you can indeed carry over [itex]Var(X_i)=\sigma^2[/itex], then you can use Chebyshev and your proof is valid.
    Whether you can carry over [itex]Var(X_i)=\sigma^2[/itex] is something I don't know; you should ask your professor. But I think you probably can.
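
    For the record, a minimal sketch of that Chebyshev step, assuming the [itex]X_i[/itex] are independent with common finite variance [itex]\sigma^2[/itex], and writing [itex]\overline{X}_n = \frac{1}{n}\sum_{i=1}^n X_i[/itex]: by independence,
    [tex] \operatorname{Var}(\overline{X}_n) = \frac{1}{n^2}\sum_{i=1}^n \operatorname{Var}(X_i) = \frac{\sigma^2}{n}, [/tex]
    so Chebyshev's inequality gives, for any [itex]\epsilon > 0[/itex],
    [tex] P\left( \left| \overline{X}_n - \frac{1}{n}\sum_{i=1}^n \mu_i \right| \geq \epsilon \right) \leq \frac{\sigma^2}{n\epsilon^2} \rightarrow 0, [/tex]
    and since [itex]\frac{1}{n}\sum_{i=1}^n \mu_i \rightarrow \mu[/itex], it follows that [itex]\overline{X}_n \rightarrow \mu[/itex] in probability.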
     
  11. Feb 22, 2013 #10
    I'll go with that. I envision the proof being quite challenging if this is not possible (and probably beyond the scope of the class). Thanks for the help, everyone.
     
  12. Feb 22, 2013 #11

    jbunniii

    And the proof using characteristic functions, which does not require finite variance, can't be applied directly because we aren't in an iid situation. We will still have ##\phi_{X+Y} = \phi_{X}\phi_{Y}##, but this won't equal ##\phi_X^2##.
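
    For comparison, a sketch of the iid argument, assuming iid [itex]X_k[/itex] with finite mean [itex]\mu[/itex]: the expansion [itex]\phi_{X_1}(s) = 1 + i\mu s + o(s)[/itex] as [itex]s \rightarrow 0[/itex] gives
    [tex] \phi_{\overline{X}_n}(t) = \prod_{k=1}^n \phi_{X_k}(t/n) = \left[ \phi_{X_1}(t/n) \right]^n = \left[ 1 + \frac{i\mu t}{n} + o(1/n) \right]^n \rightarrow e^{i\mu t}, [/tex]
    the characteristic function of the constant [itex]\mu[/itex]. Without identical distributions, the product doesn't collapse to a single power, so this shortcut is unavailable.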
     
  13. Feb 22, 2013 #12

    micromass

    The proof without the variance condition is not very difficult to understand, but it wouldn't be suitable as an exercise.
     
  14. Feb 22, 2013 #13
    Got it. If you don't mind indulging me, what's the gist of it?
     
  15. Feb 22, 2013 #14

    Ray Vickson

    The wonderful book W.Feller, "An Introduction to Probability Theory and its Applications", Vol. I, (Wiley) has a great chapter (Chapter X) on this and other limit properties. The way he does it is to look at a method of truncation:
    [tex] U_k = X_k, \text{ and }V_k = 0 \text{ if } |X_k| \leq \delta n\\
    U_k = 0, \text{ and } V_k = X_k \text{ if } |X_k| > \delta n,[/tex]
    where ##\delta > 0## is a parameter to be determined later, and ##k = 1,2,\ldots, n##. This implies ##X_k = U_k + V_k##. He then proves the Law of Large Numbers by showing that for any given ##\epsilon > 0## the constant ##\delta > 0## can be chosen so that as ##n \to \infty##,
    [tex] P\{ |U_1 + \cdots + U_n| > \frac{1}{2} \epsilon n \} \to 0\\
    \text{ and }\\
    P\{ |V_1 + \cdots + V_n | > \frac{1}{2} \epsilon n \} \to 0.[/tex]
    The Law of Large Numbers follows from these.

    The point of the truncation is to permit some statements to be made and proved even when the variance is infinite; it works as long as ##E(X_k)## is finite. For a proof of the crucial results above, consult Feller.
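
    For completeness, the combining step, under the assumption that the [itex]X_k[/itex] have been centered at their means (so the law of large numbers reads [itex]P\{|X_1 + \cdots + X_n| > \epsilon n\} \rightarrow 0[/itex]): since [itex]X_k = U_k + V_k[/itex], if both [itex]|U_1 + \cdots + U_n|[/itex] and [itex]|V_1 + \cdots + V_n|[/itex] were at most [itex]\tfrac{1}{2}\epsilon n[/itex], the triangle inequality would give [itex]|X_1 + \cdots + X_n| \leq \epsilon n[/itex]. Hence
    [tex] P\{ |X_1 + \cdots + X_n| > \epsilon n \} \leq P\{ |U_1 + \cdots + U_n| > \tfrac{1}{2}\epsilon n \} + P\{ |V_1 + \cdots + V_n| > \tfrac{1}{2}\epsilon n \} \rightarrow 0. [/tex]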
     
  16. Feb 24, 2013 #15
    This is good stuff; I'll have to take a look at the text. Thanks a lot!
     