
Mean of a sum of random variables

  1. Dec 16, 2014 #1
    1. The problem statement, all variables and given/known data
    If ##Y = X_1 + X_2 + \cdots + X_N##, prove that ##\langle Y \rangle = \langle X_1 \rangle + \langle X_2 \rangle + \cdots + \langle X_N \rangle##.

    2. Relevant equations
    ##\langle Y \rangle = \int Y P(Y)\, dY## over all ##Y##.

    3. The attempt at a solution
    I only seem to be able to show this if the ##X_i## are independent, and I also think my proof may be very wrong. I have basically said that the probability that the variables lie in the intervals ##[X_1, X_1 + dX_1], [X_2, X_2 + dX_2], \ldots, [X_N, X_N + dX_N]## can be written as
    ##\prod_{j=1}^{N} P_{X_j}(X_j)\, dX_j## (I really doubt this is right).
    Then
    [tex] \langle Y \rangle = \int \left( \sum_{i=1}^{N} X_i \right) \prod_{j=1}^{N} P_{X_j}(X_j)\, dX_j = \sum_{i=1}^{N} \int X_i \prod_{j=1}^{N} P_{X_j}(X_j)\, dX_j [/tex]
    then all the integrals apart from the ##i##th one go to one, because the various probability functions are normalised, so
    [tex] = \sum_{i=1}^{N} \int X_i P_{X_i}(X_i)\, dX_i = \sum_{i=1}^{N} \langle X_i \rangle [/tex]
    However, in saying that all the integrals go to one, I have assumed that I could separate all the integrals, i.e. that the variables were independent.

    Also, is there not a really easy way to prove this? I can't seem to find any books/websites proving it, which makes me think it's just trivial...
     
  3. Dec 16, 2014 #2

    Orodruin


    Your proof seems sound at first glance. It should be easily fixed for all distributions using P(A|B) P(B) = P(AB).
     
  4. Dec 16, 2014 #3

    Stephen Tashi


    For example, if the joint density of [itex] x_1, x_2 [/itex] is [itex] f(x_1,x_2) [/itex]

    [itex]\int \int { (x_1 + x_2) f(x_1,x_2) } dx_1 dx_2 = \int \int {x_1 f(x_1,x_2)} dx_1 dx_2 + \int \int {x_2 f(x_1,x_2) } dx_1 dx_2 [/itex]

    Then an individual integral like [itex] \int \int x_2 f(x_1,x_2) dx_1 dx_2 [/itex] has the general pattern (expressed in different variables) of

    [itex] \int \int h(r) f(r,s) ds\ dr = \int\ h(r)\ ( \int f(r,s) ds )\ dr [/itex]

    The integration [itex] \int {f(r,s)} ds [/itex] produces the density function for [itex] r [/itex]. (It's integration of a joint density to produce a marginal density.)
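
    A quick numerical sanity check of that marginalization step (a sketch in Python, assuming NumPy and SciPy are available; the correlated bivariate normal and the integration grid are my own choices for illustration):

    [code]
    # Integrate a joint density f(r, s) over s and compare the result with the
    # known marginal density of r. The joint here is bivariate normal with
    # correlation 0.7, so r and s are deliberately NOT independent.
    import numpy as np
    from scipy import stats

    joint = stats.multivariate_normal(mean=[1.0, 2.0],
                                      cov=[[1.0, 0.7], [0.7, 1.0]])

    s_grid = np.linspace(-8.0, 12.0, 4001)   # wide enough to capture the mass
    ds = s_grid[1] - s_grid[0]

    def marginal_density(r):
        # Riemann sum approximating the integral of f(r, s) over s
        points = np.column_stack([np.full_like(s_grid, r), s_grid])
        return joint.pdf(points).sum() * ds

    # The marginal of r should be N(mean 1, sd 1) regardless of the correlation.
    for r in [-1.0, 0.0, 1.0, 2.5]:
        print(r, marginal_density(r), stats.norm(1.0, 1.0).pdf(r))
    [/code]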
     
  5. Dec 17, 2014 #4

    Ray Vickson


    The result is true in general, even if the variables ##X_1, X_2, \ldots, X_n## are dependent. Look at the case ##n=2##, and take sums instead of integrals (because the result is also true for discrete random variables). Using the standard notation ##E## for expectation (instead of your physics-oriented notation ##\langle \cdot \rangle##), we have, for a joint probability mass function ##P\{X_1 = k_1, X_2 =k_2 \} = p_{12}(k_1,k_2)##:
    [tex] E(X_1 + X_2) = \sum_{k_1,k_2} p_{12}(k_1,k_2) (k_1 + k_2)
    = \sum_{k_1,k_2} p_{12}(k_1,k_2) k_1 + \sum_{k_1,k_2} p_{12}(k_1,k_2) k_2 \\
    = \sum_{k_1} k_1\underbrace{ \left( \sum_{k_2} p_{12} (k_1,k_2) \right) }_{=p_1(k_1)}
    + \sum_{k_2} k_2 \underbrace{\left( \sum_{k_1} p_{12} (k_1,k_2) \right)}_{=p_2(k_2)} \\
    = E X_1 + E X_2 [/tex]
    Here, ##p_1(k_1) = P\{ X_1 = k_1 \}## and ##p_2(k_2) = P \{ X_2 = k_2 \}## are the marginal probability mass functions of ##X_1## and ##X_2## separately.
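
    For what it is worth, the same identity can be checked numerically on a small joint pmf (a sketch in Python, assuming NumPy; the pmf values below are made up, chosen so that ##X_1## and ##X_2## are clearly dependent):

    [code]
    # E(X1 + X2) from the joint pmf versus E(X1) + E(X2) from the marginals.
    import numpy as np

    # p12[k1, k2] for k1, k2 in {0, 1, 2}; this is not a product of its
    # marginals, so X1 and X2 are dependent.
    p12 = np.array([[0.20, 0.05, 0.00],
                    [0.05, 0.30, 0.05],
                    [0.00, 0.05, 0.30]])
    assert abs(p12.sum() - 1.0) < 1e-12

    k = np.arange(3)
    p1 = p12.sum(axis=1)   # marginal pmf of X1
    p2 = p12.sum(axis=0)   # marginal pmf of X2

    E_sum = sum(p12[i, j] * (k[i] + k[j]) for i in range(3) for j in range(3))
    print(E_sum, k @ p1 + k @ p2)   # both print 2.2
    [/code]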
     
  6. Dec 17, 2014 #5
    Thanks for the replies. I have another issue related to the same setup... If we have ##n## independent random variables ##X_i##, ##i = 1, \ldots, n##, each with the same mean ##\langle X \rangle## and the same variance, how do we know that ##\langle X_i^2 \rangle = \langle X^2 \rangle##? I can't see this, although I'm guessing it's obvious...
     
  7. Dec 17, 2014 #6

    Orodruin


    Start from the definition of variance V(X) = <X^2> - <X>^2 (or, equivalently, V(X) = <(X - <X>)^2>).
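
    Rearranged, that definition gives exactly the quantity you are after, for any single variable ##X##:
    [tex] \langle X^2 \rangle = V(X) + \langle X \rangle^2 [/tex]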
     
  8. Dec 17, 2014 #7
    ##\langle X_i^2 \rangle = \langle X_i \rangle^2 + V(X) = \langle X \rangle^2 + V(X) = \langle X^2 \rangle##?
     
  9. Dec 17, 2014 #8

    Orodruin


    What is your definition of X?
     
  10. Dec 17, 2014 #9
    Some random variable that has a probability distribution with mean ##\langle X \rangle## and variance ##V(X) = \langle X^2 \rangle - \langle X \rangle^2##?
     
  11. Dec 17, 2014 #10

    Orodruin


    I am just asking because it is not clear whether it is one of the ##X_i## or not.
     
  12. Dec 17, 2014 #11
    No it isn't...
     
  13. Dec 17, 2014 #12

    Orodruin


    Does it have the same variance as the ##X_i##? If it does, you can just as well include it among them; otherwise the statement is not really correct.
     
  14. Dec 17, 2014 #13
    Well, no, I guess I have just assumed that and shouldn't have, which means that proof doesn't work...
     
  15. Dec 17, 2014 #14

    Orodruin


    What I am curious about is whether this is the actual problem statement:
    '##n## independent random variables ##X_i##, each with the same mean ##\langle X \rangle## and the same variance'
    This would mean that ##\langle X_i \rangle = \langle X \rangle##, but not necessarily ##V(X_i) = V(X)##, unless you also add a "##V(X)##" after "same variance".

    If ##V(X_i) = V(X)##, then the problem is trivial, as you noticed. If it is not, then the statement is false.
     
  16. Dec 17, 2014 #15
    Oh I see... The actual statement is
    'each with the same mean ##\langle X \rangle## and the same variance ##\sigma_x^2##'
    but I didn't think that defining what it actually was mattered; it clearly does. So I guess that makes it fine then.
    Thank you.
     
  17. Dec 17, 2014 #16

    Stephen Tashi



    It isn't clear what the notation [itex] <X^2> [/itex] signifies. If each of the [itex] X_i [/itex] has mean [itex] \mu [/itex], then it is not true that the mean value of [itex] X_i^2 [/itex] must equal [itex] \mu^2 [/itex]. If [itex] R [/itex] and [itex] W [/itex] are independent random variables that have the same mean value, it is not true that [itex] R^2 [/itex] and [itex] W^2 [/itex] must have the same mean value.
    If [itex] R [/itex] and [itex] W [/itex] are independent, identically distributed random variables, you could get that result.
     
  18. Dec 17, 2014 #17

    Orodruin


    I disagree: <X^2> is standard notation for the expectation value of X^2, not for the square of the expectation value, which is normally written <X>^2. If the mean and variance of all of the stochastic variables are the same, then so are the expectation values of their squares.
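
    A quick Monte Carlo illustration of that last point (a sketch in Python, assuming NumPy; the particular normal/uniform pair is my own choice): two differently shaped distributions with the same mean and variance necessarily share <X^2> = V(X) + <X>^2.

    [code]
    # Two different distributions with equal mean and variance share <X^2>.
    import numpy as np

    rng = np.random.default_rng(0)
    mean, var = 2.0, 3.0

    x_normal = rng.normal(mean, np.sqrt(var), 1_000_000)

    # Uniform(a, b) has variance (b - a)^2 / 12, so a half-width of
    # sqrt(3 * var) around the mean reproduces the requested variance.
    h = np.sqrt(3.0 * var)
    x_uniform = rng.uniform(mean - h, mean + h, 1_000_000)

    print(np.mean(x_normal**2), np.mean(x_uniform**2), var + mean**2)
    # All three agree (the first two up to Monte Carlo noise): about 7.0
    [/code]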
     